Recent Noteworthy Physics Papers

  • #71
D. Salart et al., "Testing the speed of 'spooky action at a distance'", Nature v.454, p.861 (2008).

Abstract: Correlations are generally described by one of two mechanisms: either a first event influences a second one by sending information encoded in bosons or other physical carriers, or the correlated events have some common causes in their shared history. Quantum physics predicts an entirely different kind of cause for some correlations, named entanglement. This reveals itself in correlations that violate Bell inequalities (implying that they cannot be described by common causes) between space-like separated events (implying that they cannot be described by classical communication). Many Bell tests have been performed, and loopholes related to locality and detection have been closed in several independent experiments. It is still possible that a first event could influence a second, but the speed of this hypothetical influence (Einstein's 'spooky action at a distance') would need to be defined in some universal privileged reference frame and be greater than the speed of light. Here we put stringent experimental bounds on the speed of all such hypothetical influences. We performed a Bell test over more than 24 hours between two villages separated by 18 km and approximately east–west oriented, with the source located precisely in the middle. We continuously observed two-photon interferences well above the Bell inequality threshold. Taking advantage of the Earth's rotation, the configuration of our experiment allowed us to determine, for any hypothetically privileged frame, a lower bound for the speed of the influence. For example, if such a privileged reference frame exists and is such that the Earth's speed in this frame is less than 10^-3 times that of the speed of light, then the speed of the influence would have to exceed that of light by at least four orders of magnitude.
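
To get a feel for the kinematics behind such a bound, here is a minimal back-of-the-envelope sketch in Python (this is not the paper's actual analysis, which also folds in the assumed velocity of the hypothetical frame): if the two detection events, 18 km apart, are simultaneous in the privileged frame to within some residual time window, any influence connecting them must travel at least distance/window. The few-nanosecond window below is a made-up illustrative number.

[code]
# Toy kinematic estimate (illustration only; the published bound also depends
# on the assumed velocity of the hypothetical privileged frame).
c = 2.998e8          # speed of light, m/s
d = 18e3             # separation between the two villages, m
dt = 6e-9            # hypothetical residual asynchrony in the privileged frame, s

v_min = d / dt       # minimum speed of the hypothetical influence
print(f"v_min = {v_min:.2e} m/s  =  {v_min / c:.1e} c")
# A ~6 ns window already gives ~10^4 c, the order of magnitude quoted in the
# abstract for an Earth speed below 1e-3 c in the privileged frame.
[/code]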

Also read the News and Views article in the same issue.

Edit: Coverage of this result can also be found at PhysicsWorld: http://physicsworld.com/cws/article/news/35404

Zz.
 
  • #72
FUNKER said:
just want to say I fully AGREE! :!)


yes yes :))))
 
  • #73
C.G. Camara et al. "Correlation between nanosecond X-ray flashes and stick–slip friction in peeling tape", Nature v.455, p.1089 (2008).

Abstract: Relative motion between two contacting surfaces can produce visible light, called triboluminescence. This concentration of diffuse mechanical energy into electromagnetic radiation has previously been observed to extend even to X-ray energies. Here we report that peeling common adhesive tape in a moderate vacuum produces radio and visible emission along with nanosecond, 100-mW X-ray pulses that are correlated with stick–slip peeling events. For the observed 15-keV peak in X-ray energy, various models give a competing picture of the discharge process, with the length of the gap between the separating faces of the tape being 30 or 300 μm at the moment of emission. The intensity of X-ray triboluminescence allowed us to use it as a source for X-ray imaging. The limits on energies and flash widths that can be achieved are beyond current theories of tribology.

This has been getting a lot of popular media coverage, because the simple act of peeling ordinary scotch tape in a moderate vacuum can actually generate short bursts of x-rays.

Zz.
 
  • #74
Event-by-Event Simulation of Einstein-Podolsky-Rosen-Bohm Experiments:

http://www.springerlink.com/content/p28v88867w7213mu/ Open Access
http://arxiv.org/pdf/0712.3693

Abstract We construct an event-based computer simulation model of the Einstein-Podolsky-Rosen-Bohm experiments with photons. The algorithm is a one-to-one copy of the data gathering and analysis procedures used in real laboratory experiments. We consider two types of experiments, those with a source emitting photons with opposite but otherwise unpredictable polarization and those with a source emitting photons with fixed polarization. In the simulation, the choice of the direction of polarization measurement for each detection event is arbitrary. We use three different procedures to identify pairs of photons and compute the frequency of coincidences by analyzing experimental data and simulation data. The model strictly satisfies Einstein’s criteria of local causality, does not rely on any concept of quantum theory and reproduces the results of quantum theory for both types of experiments. We give a rigorous proof that the probabilistic description of the simulation model yields the quantum theoretical expressions for the single- and two-particle expectation values.​
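
Since the paper is itself about an event-by-event simulation, here is a minimal Python sketch of the data-gathering and coincidence-analysis pipeline it describes: each detection event carries a local outcome and a time tag, and pairs are identified afterwards by a coincidence window, just as in laboratory experiments. The specific local model below (Malus-law outcomes plus a polarization-dependent delay) is only a placeholder, not the authors' algorithm, and is not claimed to reproduce the quantum correlations.

[code]
import numpy as np

rng = np.random.default_rng(0)

def station(xi, angle, t0=1e-6):
    """One detection station: local outcome +/-1 plus a time tag.
    The delay model here is a placeholder, NOT the model of the paper."""
    p_plus = np.cos(xi - angle) ** 2                      # Malus-law probability
    outcome = np.where(rng.random(xi.size) < p_plus, 1, -1)
    delay = t0 * rng.random(xi.size) * np.sin(2 * (xi - angle)) ** 2
    return outcome, delay

def correlation(a, b, n=200_000, window=2e-7):
    """E(a,b) computed only from time-coincident pairs, as in the lab."""
    xi = rng.uniform(0, 2 * np.pi, n)                     # shared hidden variable
    x, t1 = station(xi, a)
    y, t2 = station(xi + np.pi / 2, b)                    # 'opposite' polarization
    coinc = np.abs(t1 - t2) < window                      # pair identification
    return np.mean(x[coinc] * y[coinc])

for a, b in [(0, np.pi / 8), (0, 3 * np.pi / 8)]:
    print(f"E({a:.2f}, {b:.2f}) = {correlation(a, b):+.3f}")
[/code]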
 
  • #75
A. Fragner et al. "Resolving Vacuum Fluctuations in an Electrical Circuit by Measuring the Lamb Shift", Science v.322, p.1357 (2008).

Abstract: Quantum theory predicts that empty space is not truly empty. Even in the absence of any particles or radiation, in pure vacuum, virtual particles are constantly created and annihilated. In an electromagnetic field, the presence of virtual photons manifests itself as a small renormalization of the energy of a quantum system, known as the Lamb shift. We present an experimental observation of the Lamb shift in a solid-state system. The strong dispersive coupling of a superconducting electronic circuit acting as a quantum bit (qubit) to the vacuum field in a transmission-line resonator leads to measurable Lamb shifts of up to 1.4% of the qubit transition frequency. The qubit is also observed to couple more strongly to the vacuum field than to a single photon inside the cavity, an effect that is explained by taking into account the limited anharmonicity of the higher excited qubit states.

An amazing feat to detect a Lamb shift in a many-body system such as a superconducting electronic circuit.

Zz.
 
  • #76
J. S. Lundeen and A. M. Steinberg, "Experimental Joint Weak Measurement on a Photon Pair as a Probe of Hardy's Paradox", Phys. Rev. Lett. 102, 020404 (2009).

Abstract: It has been proposed that the ability to perform joint weak measurements on postselected systems would allow us to study quantum paradoxes. These measurements can investigate the history of those particles that contribute to the paradoxical outcome. Here we experimentally perform weak measurements of joint (i.e., nonlocal) observables. In an implementation of Hardy's paradox, we weakly measure the locations of two photons, the subject of the conflicting statements behind the paradox. Remarkably, the resulting weak probabilities verify all of these statements but, at the same time, resolve the paradox.

This experiment appears to be a confirmation and resolution of Hardy's paradox.

A news article on this work can be found here: http://exchangemagazine.com/morningpost/2009/week3/Friday/0116014.htm

Zz.
 
  • #77
A. Cabello et al., "Proposed Bell Experiment with Genuine Energy-Time Entanglement", Phys. Rev. Lett. v.102, p.040401 (2009).

Abstract: Franson's Bell experiment with energy-time entanglement [Phys. Rev. Lett. 62, 2205 (1989)] does not rule out all local hidden variable models. This defect can be exploited to compromise the security of Bell inequality-based quantum cryptography. We introduce a novel Bell experiment using genuine energy-time entanglement, based on a novel interferometer, which rules out all local hidden variable models. The scheme is feasible with actual technology.

Zz.
 
  • #78
This is a recent paper addressing the origin of spin-glass behavior in hole-doped cuprate superconductors. The author proposes a new mechanism for the spin glass that is compatible with Zhang-Rice singlet states. The paper is located at

J. Phys.: Condens. Matter 21 (2009) 075702

Abstract: To address the incompatibility of Zhang–Rice singlet formation and the observed spin glass behavior, an effective model is proposed for the electronic behavior of cuprate materials. The model includes an antiferromagnetic interaction between the spin of the hole in a Zhang–Rice orbital and the spin of the hole on the corresponding copper site. While in the large interaction limit this recovers the t–J model, in the low energy limit the Zhang–Rice singlets are deformed. It is also shown that such deformation can induce random defect ferromagnetic (FM) bonds between adjacent local spins, an effect herein referred to as unusual double exchange, and then spin glass behavior shall result in the case of localized holes. A derivation of the model is also presented.
 
  • #79
V. Moshchalkov et al., "Type-1.5 Superconductivity" Phys. Rev. Lett. 102, 117001 (2009)

Abstract: We demonstrate the existence of a novel superconducting state in high quality two-component MgB2 single crystalline superconductors where a unique combination of both type-1 (λ1/ξ1 < 1/√2) and type-2 (λ2/ξ2 > 1/√2) superconductor conditions is realized for the two components of the order parameter. This condition leads to a vortex-vortex interaction attractive at long distances and repulsive at short distances, which stabilizes unconventional stripe- and gossamerlike vortex patterns that we have visualized in this type-1.5 superconductor using Bitter decoration and also reproduced in numerical simulations.

If this is true, they have found a new phase of superconductivity where both Type I and Type II properties reside in the same material (but in different bands).
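
A trivial Python sketch of the criterion quoted in the abstract: in Ginzburg-Landau language a band is type-1 if its ratio κ = λ/ξ is below 1/√2 and type-2 if it is above. The penetration depths and coherence lengths below are made-up numbers purely for illustration.

[code]
import math

def classify(lam_nm, xi_nm):
    """Classify a superconducting band by its Ginzburg-Landau parameter."""
    kappa = lam_nm / xi_nm
    kind = "type-1" if kappa < 1 / math.sqrt(2) else "type-2"
    return kappa, kind

# Hypothetical penetration depth / coherence length for each band (nm)
for name, lam, xi in [("band 1", 50.0, 100.0), ("band 2", 100.0, 13.0)]:
    kappa, kind = classify(lam, xi)
    print(f"{name}: kappa = {kappa:.2f} -> {kind}")
# A material whose two bands fall on opposite sides of 1/sqrt(2) is what the
# authors call "type-1.5".
[/code]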

You may also read a commentary at http://physics.aps.org/articles/v2/22 and get a free copy of the paper there.

Zz.
 
  • #80
D. Gross, S.T. Flammia and J. Eisert, "Most Quantum States Are Too Entangled To Be Useful As Computational Resources" Phys. Rev. Lett. 102, 190501 (2009)

Abstract: It is often argued that entanglement is at the root of the speedup for quantum compared to classical computation, and that one needs a sufficient amount of entanglement for this speedup to be manifest. In measurement-based quantum computing, the need for a highly entangled initial state is particularly obvious. Defying this intuition, we show that quantum states can be too entangled to be useful for the purpose of computation, in that high values of the geometric measure of entanglement preclude states from offering a universal quantum computational speedup. We prove that this phenomenon occurs for a dramatic majority of all states: the fraction of useful n-qubit pure states is less than exp(-n²). This work highlights a new aspect of the role entanglement plays for quantum computational speedups.
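
To get a feel for how restrictive the exp(-n²) bound from the abstract is, a couple of lines of arithmetic:

[code]
import math

# Upper bound from the abstract on the fraction of computationally useful
# n-qubit pure states.
for n in (5, 10, 20):
    print(f"n = {n:2d}:  exp(-n^2) = {math.exp(-n**2):.3e}")
# Already at 10 qubits the useful fraction is below 4e-44.
[/code]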

Quantum computers are still far from realization. Fast decoherence and the difficulty of producing a high degree of entanglement between large numbers of qubits are usually the first big obstacles that come to mind. Now Gross et al. show that most highly entangled quantum states will not provide a significant increase in computational speed compared to classical computers. So it might be necessary in the future to identify and understand the few remaining entangled states which are indeed useful for computation.

There is also an accompanying Viewpoint on this article: http://link.aps.org/doi/10.1103/Physics.2.38
 
  • #81
M. Gu et al., "More really is different", Physica D: Nonlinear Phenomena, v.238, p.835 (2009).

Abstract: In 1972, P.W. Anderson suggested that ‘More is Different’, meaning that complex physical systems may exhibit behavior that cannot be understood only in terms of the laws governing their microscopic constituents. We strengthen this claim by proving that many macroscopic observable properties of a simple class of physical systems (the infinite periodic Ising lattice) cannot in general be derived from a microscopic description. This provides evidence that emergent behavior occurs in such systems, and indicates that even if a ‘theory of everything’ governing all microscopic interactions were discovered, the understanding of macroscopic order is likely to require additional insights.

Read the News and Views article on this paper in Nature 459, 332-334 (21 May 2009).

Edit: you can read the arXiv version here: http://arxiv.org/abs/0809.0151

Zz.
 
  • #82
A. V. Ponomarev et al., "ac-Driven Atomic Quantum Motor", Phys. Rev. Lett. v.102, p.230601 (2009) .

Abstract: We propose an ac-driven quantum motor consisting of two different, interacting ultracold atoms placed into a ring-shaped optical lattice and submerged in a pulsating magnetic field. While the first atom carries a current, the second one serves as a quantum starter. For fixed zero-momentum initial conditions the asymptotic carrier velocity converges to a unique nonzero value. We also demonstrate that this quantum motor performs work against a constant load.

A review of this paper can also be found at ScienceNOW: http://sciencenow.sciencemag.org/cgi/content/full/2009/609/1

Zz.
 
  • #83
R. Horodecki et al., "Quantum Entanglement", Rev. Mod. Phys. v.81, p.865 (2009).

Abstract: From the point of view of quantum information science, entanglement is a resource that can be used to perform tasks that are impossible in a classical world. In a certain sense, the more entanglement we have, the better we can perform those tasks. Thus, one of the main goals in this field has been to identify under which conditions two or more systems are entangled, and how entangled they are. This paper reviews the main criteria to detect entanglement as well as entanglement measures and also discusses the role of entanglement in quantum communication and cryptography.

This is a HUGE, 78-page review of quantum entanglement. We get a lot of frequent questions on this topic, so it is appropriate to post a source that has a wealth of information and references.

The arXiv version of this paper can be found at http://arxiv.org/abs/quant-ph/0702225

Zz.
 
  • #84
M. Karski et al., "Quantum Walk in Position Space with Single Optically Trapped Atoms", Science v.325, p. 174 (2009).

Abstract: The quantum walk is the quantum analog of the well-known random walk, which forms the basis for models and applications in many realms of science. Its properties are markedly different from the classical counterpart and might lead to extensive applications in quantum information science. In our experiment, we implemented a quantum walk on the line with single neutral atoms by deterministically delocalizing them over the sites of a one-dimensional spin-dependent optical lattice. With the use of site-resolved fluorescence imaging, the final wave function is characterized by local quantum state tomography, and its spatial coherence is demonstrated. Our system allows the observation of the quantum-to-classical transition and paves the way for applications, such as quantum cellular automata.
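
For readers who want to play with the underlying model, here is a minimal Python sketch of a discrete-time (Hadamard-coined) quantum walk on a line — the textbook abstraction behind this experiment rather than the actual spin-dependent optical-lattice implementation. It shows the ballistic spreading that distinguishes the quantum walk from its classical counterpart.

[code]
import numpy as np

def quantum_walk(steps=100):
    """Hadamard-coined quantum walk on a line; returns the position distribution."""
    n_pos = 2 * steps + 1
    amp = np.zeros((2, n_pos), dtype=complex)        # amp[coin, position]
    amp[0, steps] = 1 / np.sqrt(2)                   # start at the centre with a
    amp[1, steps] = 1j / np.sqrt(2)                  # symmetric coin state
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard coin operator
    for _ in range(steps):
        amp = H @ amp                                # toss the coin
        shifted = np.zeros_like(amp)
        shifted[0, 1:] = amp[0, :-1]                 # coin |0> steps right
        shifted[1, :-1] = amp[1, 1:]                 # coin |1> steps left
        amp = shifted
    return np.sum(np.abs(amp) ** 2, axis=0)

p = quantum_walk(100)
x = np.arange(-100, 101)
print("total probability :", round(p.sum(), 6))                  # ~1 (unitary)
print("quantum std dev   :", round(float(np.sqrt(np.sum(p * x**2))), 2))
print("classical std dev :", round(np.sqrt(100), 2))             # sqrt(steps)
# The quantum spread grows linearly with the number of steps (ballistic),
# versus the sqrt(steps) diffusive spread of the classical walk.
[/code]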

Read the ScienceNOW coverage at http://sciencenow.sciencemag.org/cgi/content/full/2009/710/2

Zz.
 
  • #85
M. Aßmann et al., "Higher-Order Photon Bunching in a Semiconductor Microcavity", Science v.325, p.297 (2009).

Abstract: Quantum mechanically indistinguishable particles such as photons may show collective behavior. Therefore, an appropriate description of a light field must consider the properties of an assembly of photons instead of independent particles. We have studied multiphoton correlations up to fourth order in the single-mode emission of a semiconductor microcavity in the weak and strong coupling regimes. The counting statistics of single photons were recorded with picosecond time resolution, allowing quantitative measurement of the few-photon bunching inside light pulses. Our results show bunching behavior in the strong coupling case, which vanishes in the weak coupling regime as the cavity starts lasing. In particular, we verify the n factorial prediction for the zero-delay correlation function of n thermal light photons.
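
The n! prediction mentioned in the abstract is easy to check numerically for the standard model of thermal (chaotic) light, in which the instantaneous intensity is exponentially distributed and the zero-delay correlation is ⟨I^n⟩/⟨I⟩^n = n!. A quick Monte-Carlo sketch in Python (the exponential-intensity model is the assumption here, not anything from the experiment itself):

[code]
import math
import numpy as np

rng = np.random.default_rng(1)
I = rng.exponential(scale=1.0, size=2_000_000)   # thermal (chaotic) intensity samples

for n in range(2, 5):
    g_n = np.mean(I**n) / np.mean(I) ** n        # zero-delay correlation g^(n)(0)
    print(f"g^({n})(0) = {g_n:.2f}   (thermal prediction n! = {math.factorial(n)})")
[/code]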

The bunching and anti-bunching phenomena are considered to be THE strongest evidence for photons; they have no classical counterpart.

Zz.
 
  • #86
G. Kirchmair et al., "State-independent experimental test of quantum contextuality", Nature v.460, p.494 (2009).

Abstract: The question of whether quantum phenomena can be explained by classical models with hidden variables is the subject of a long-lasting debate. In 1964, Bell showed that certain types of classical models cannot explain the quantum mechanical predictions for specific states of distant particles, and some types of hidden variable models have been experimentally ruled out. An intuitive feature of classical models is non-contextuality: the property that any measurement has a value independent of other compatible measurements being carried out at the same time. However, a theorem derived by Kochen, Specker and Bell shows that non-contextuality is in conflict with quantum mechanics. The conflict resides in the structure of the theory and is independent of the properties of special states. It has been debated whether the Kochen–Specker theorem could be experimentally tested at all. First tests of quantum contextuality have been proposed only recently, and undertaken with photons and neutrons. But these tests required the generation of special quantum states and left various loopholes open. Here we perform an experiment with trapped ions that demonstrates a state-independent conflict with non-contextuality. The experiment is not subject to the detection loophole and we show that, despite imperfections and possible measurement disturbances, our results cannot be explained in non-contextual terms.
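
The state-independent conflict can be made concrete with the Peres-Mermin square of two-qubit observables, the standard construction behind tests of this kind (no claim is made here that this is exactly the observable set used in the experiment). The short Python check below shows that the three rows multiply to +1 while the columns multiply to +1, +1 and -1, which no noncontextual assignment of pre-existing ±1 values can reproduce.

[code]
import numpy as np

# Pauli matrices
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Peres-Mermin square: nine two-qubit observables on a 3x3 grid
square = [
    [np.kron(X, I2), np.kron(I2, X), np.kron(X, X)],
    [np.kron(I2, Y), np.kron(Y, I2), np.kron(Y, Y)],
    [np.kron(X, Y),  np.kron(Y, X),  np.kron(Z, Z)],
]

def product(ops):
    out = np.eye(4, dtype=complex)
    for op in ops:
        out = out @ op
    return out

for i, row in enumerate(square):
    print(f"row {i} product:    {product(row)[0, 0].real:+.0f} * identity")
for j in range(3):
    col = [square[i][j] for i in range(3)]
    print(f"column {j} product: {product(col)[0, 0].real:+.0f} * identity")
# Rows multiply to +1, +1, +1 but columns to +1, +1, -1: no assignment of
# pre-existing +/-1 values to the nine observables can satisfy all six products.
[/code]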

Zz.
 
  • #87
Y. Jompol et al., "Probing Spin-Charge Separation in a Tomonaga-Luttinger Liquid", Science v.325, p.597 (2009).

Abstract: In a one-dimensional (1D) system of interacting electrons, excitations of spin and charge travel at different speeds, according to the theory of a Tomonaga-Luttinger liquid (TLL) at low energies. However, the clear observation of this spin-charge separation is an ongoing challenge experimentally. We have fabricated an electrostatically gated 1D system in which we observe spin-charge separation and also the predicted power-law suppression of tunneling into the 1D system. The spin-charge separation persists even beyond the low-energy regime where the TLL approximation should hold. TLL effects should therefore also be important in similar, but shorter, electrostatically gated wires, where interaction effects are being studied extensively worldwide.

Just imagine: a charge carrier (say, an electron) somehow behaves as if its spin and its charge have been fractionalized and thus move separately. This is what spin-charge separation is. It is one of those fundamental phenomena in condensed matter physics that isn't observed anywhere else, but it is something that could potentially be a fundamental principle in the physics of elementary particles.

Previous experiments have shown signatures of such spin-charge separation. It has been shown that the charge and thermal currents in 1D organic conductors violate the Wiedemann-Franz law, an indication of a possible spin-charge separation. The charge current had a different dispersion than the thermal currents, something you don't find in a standard Solid State Physics text.
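
For reference, the Wiedemann-Franz law mentioned above says that the ratio of thermal to electrical conductivity, κ/(σT), should equal the Sommerfeld Lorenz number L0 = (π²/3)(kB/e)²; the reported violation is a deviation from this value. A two-line Python check of the number:

[code]
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
e = 1.602176634e-19     # elementary charge, C

L0 = (math.pi**2 / 3) * (k_B / e) ** 2
print(f"Sommerfeld Lorenz number L0 = {L0:.3e} W Ohm / K^2")   # ~2.44e-8
[/code]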

The new work takes a different approach: tunneling into a 1D system. There appear to be clear signatures of spin-charge separation in the tunneling currents that were observed.

Zz.
 
  • #88
S. S. Hodgman et al., "Metastable Helium: A New Determination of the Longest Atomic Excited-State Lifetime", Phys. Rev. Lett. v.103, p.053002 (2009).

Abstract: Excited atoms may relax to the ground state by radiative decay, a process which is usually very fast (of order nanoseconds). However, quantum-mechanical selection rules can prevent such rapid decay, in which case these “metastable” states can have lifetimes of order seconds or longer. In this Letter, we determine experimentally the lifetime of the longest-lived neutral atomic state—the first excited state of helium (the [itex]2 ^3S_1[/itex] metastable state)—to the highest accuracy yet measured. We use laser cooling and magnetic trapping to isolate a cloud of metastable helium (He*) atoms from their surrounding environment, and measure the decay rate to the ground [itex]1 ^1S_0[/itex] state via extreme ultraviolet (XUV) photon emission. This is the first measurement using a virtually unperturbed ensemble of isolated helium atoms, and yields a value of 7870(510) seconds, in excellent agreement with the predictions of quantum electrodynamic theory.

Whoa! That's more than 2 hours!

Zz.
 
  • #89
Z. Bern et al., "Ultraviolet Behavior of N=8 Supergravity at Four Loops", Phys. Rev. Lett. 103, 081301 (2009).

Abstract: We describe the construction of the complete four-loop four-particle amplitude of N=8 supergravity. The amplitude is ultraviolet finite, not only in four dimensions, but in five dimensions as well. The observed extra cancellations provide additional nontrivial evidence that N=8 supergravity in four dimensions may be ultraviolet finite to all orders of perturbation theory.

Read a review of this work (and get a free copy of the paper) at http://physics.aps.org/articles/v2/70

Zz.
 
  • #90
L. Maccone "Quantum Solution to the Arrow-of-Time Dilemma", Phys. Rev. Lett. 103, 080401 (2009).

Abstract: The arrow-of-time dilemma states that the laws of physics are invariant for time inversion, whereas the familiar phenomena we see everyday are not (i.e., entropy increases). I show that, within a quantum mechanical framework, all phenomena which leave a trail of information behind (and hence can be studied by physics) are those where entropy necessarily increases or remains constant. All phenomena where the entropy decreases must not leave any information of their having happened. This situation is completely indistinguishable from their not having happened at all. In the light of this observation, the second law of thermodynamics is reduced to a mere tautology: physics cannot study those processes where entropy has decreased, even if they were commonplace.

Read the Focus article on this paper here:

http://focus.aps.org/story/v24/st7

Zz.
 
  • #91
laurencn106 said:
D. J. Kapner et al., "Tests of the Gravitational Inverse-Square Law below the Dark-Energy Length Scale", Phys. Rev. Lett. 98, 021101 (2007).

Abstract: We conducted three torsion-balance experiments to test the gravitational inverse-square law at separations between 9.53 mm and 55 µm, probing distances less than the dark-energy length scale [itex]\lambda_d = \sqrt[4]{\hbar c / \rho_d} \approx 85\,\mu\mathrm{m}[/itex]. We find with 95% confidence that the inverse-square law holds (|α| ≤ 1) down to a length scale λ = 56 µm and that an extra dimension must have a size R ≤ 44 µm.
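
For the curious, the ~85 µm dark-energy length scale quoted above is easy to reproduce from the observed dark-energy density of roughly 3.8 keV/cm³ (that density value is an approximate number quoted from memory). A quick Python sketch:

[code]
hbar = 1.054571817e-34      # J s
c = 2.99792458e8            # m/s
keV = 1.602176634e-16       # J

rho_d = 3.8 * keV / 1e-6    # dark-energy density in J/m^3 (~3.8 keV/cm^3, approximate)
lambda_d = (hbar * c / rho_d) ** 0.25
print(f"lambda_d = {lambda_d * 1e6:.0f} micrometres")   # ~85 um
[/code]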

Thanks for your contribution, but this paper was highlighted already 2 years ago here:

https://www.physicsforums.com/showpost.php?p=1301594&postcount=39

As per the "theme" of this thread, we try to highlight papers within the past year. If you are unsure if a paper has been highlighted here already, do a search on the thread on the first author's name.

Zz.
 
  • #92
S. Rao et al., "Measurement of Mechanical Forces Acting on Optically Trapped Dielectric Spheres Induced by Surface-Enhanced Raman Scattering", Phys. Rev. Lett. v.102, p.087401 (2009).

Abstract: Surface enhanced Raman scattering (SERS) is studied from optically trapped dielectric spheres partially covered with silver colloids in a solution with SERS active molecules. The Raman scattering and Brownian motion of the sphere are simultaneously measured to reveal correlations between the enhancement of the Raman signal and the average position of the sphere. The correlations are due to the momentum transfer of the emitted Raman photons from the probe molecules. The addition of a mechanical force measurement provides a different dimension to the study of Raman processes.

You may also read the Physical Review Focus story at http://focus.aps.org/story/v24/st12

Zz.
 
  • #93
A. J. Bennett et al., "Interference of dissimilar photon sources", Nature Physics v.5, p.715-717 (2009).

Abstract: If identical photons meet at a semi-transparent mirror they seem to leave in the same direction, an effect called 'two-photon interference'. It has been known for some time that this effect should occur for photons generated by dissimilar sources with no common history, provided the measurement cannot distinguish between the photons. Here, we report a technique for observing such interference with isolated, unsynchronized sources for which the coherence times differ by several orders of magnitude. In an experiment we cause photons generated by different physical processes, with different photon statistics, to interfere. One of the sources is stimulated emission from a tunable laser, which has Poissonian statistics and a nanoelectronvolt bandwidth. The other is spontaneous emission from a quantum dot in a p–i–n diode with a few-microelectronvolt linewidth. We develop a theory to explain the visibility of interference, which is primarily limited by the timing resolution of our detectors.

It is well known that there is a close connection between indistinguishability and interference. There have therefore been many recent efforts to test experimentally to what extent distinguishable photons can be made indistinguishable. This has been shown before in several systems, including single atoms, ions, consecutive single photons from single quantum dots and even different semiconductor nanostructures. Bennett et al. now show that even photons from completely different light sources can exhibit two-photon interference.
 
  • #94
Are there any papers published at the high-school level?
 
  • #95
Quantum Zeno effect explains magnetic-sensitive radical-ion-pair reactions, Phys. Rev. E 80, 056115 (2009) - http://arxiv.org/abs/0806.0739

Abstract:Chemical reactions involving radical-ion pairs are ubiquitous in biology, since not only are they at the basis of the photosynthetic reaction chain, but are also assumed to underlie the biochemical magnetic compass used by avian species for navigation. Recent experiments with magnetic-sensitive radical-ion-pair reactions provided strong evidence for the radical-ion-pair magnetoreception mechanism, verifying the expected magnetic sensitivities and chemical product yield changes. It is here shown that the theoretical description of radical-ion-pair reactions used since the 70s cannot explain the observed data, because it is based on phenomenological equations masking quantum coherence effects. The fundamental density-matrix equation derived here from basic quantum measurement theory considerations naturally incorporates the quantum Zeno effect and readily explains recent experimental observations on low- and high magnetic-field radical-ion-pair reactions.
 
  • #96
R. Gerritsma et al. "Quantum simulation of the Dirac equation", Nature v.463, p.68 (2010) .

Abstract: The Dirac equation successfully merges quantum mechanics with special relativity. It provides a natural description of the electron spin, predicts the existence of antimatter and is able to reproduce accurately the spectrum of the hydrogen atom. The realm of the Dirac equation—relativistic quantum mechanics—is considered to be the natural transition to quantum field theory. However, the Dirac equation also predicts some peculiar effects, such as Klein’s paradox and ‘Zitterbewegung’, an unexpected quivering motion of a free relativistic quantum particle. These and other predicted phenomena are key fundamental examples for understanding relativistic quantum effects, but are difficult to observe in real particles. In recent years, there has been increased interest in simulations of relativistic quantum effects using different physical set-ups in which parameter tunability allows access to different physical regimes. Here we perform a proof-of-principle quantum simulation of the one-dimensional Dirac equation using a single trapped ion set to behave as a free relativistic quantum particle. We measure the particle position as a function of time and study Zitterbewegung for different initial superpositions of positive- and negative-energy spinor states, as well as the crossover from relativistic to non-relativistic dynamics. The high level of control of trapped-ion experimental parameters makes it possible to simulate textbook examples of relativistic quantum physics.
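
For readers who want to see Zitterbewegung on a laptop rather than in an ion trap, here is a small toy numerical evolution of the 1+1D Dirac equation in momentum space (Python, natural units, made-up parameters; this is of course not the trapped-ion scheme itself): a Gaussian packet prepared in a superposition of positive- and negative-energy states shows the quivering of ⟨x⟩(t) on top of a slow drift.

[code]
import numpy as np

hbar = c = m = 1.0                         # natural units (illustrative choice)
N, L = 1024, 200.0                         # grid points, box length
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)

# initial Gaussian packet at rest, spinor (1, 1)/sqrt(2): a superposition of
# positive- and negative-energy states, which is what produces Zitterbewegung
g = np.exp(-x**2 / (2 * 2.0**2)).astype(complex)
spinor_k = np.fft.fft(np.array([g, g]) / np.sqrt(2), axis=1)

E = np.sqrt((c * hbar * k) ** 2 + (m * c**2) ** 2)      # relativistic dispersion
nx, nz = c * hbar * k / E, m * c**2 / E                 # H(k) = E (nx*sx + nz*sz)

def evolve(t):
    """Exact evolution exp(-i H(k) t / hbar) applied mode by mode."""
    cth, sth = np.cos(E * t / hbar), np.sin(E * t / hbar)
    up, dn = spinor_k
    return np.array([(cth - 1j * sth * nz) * up - 1j * sth * nx * dn,
                     -1j * sth * nx * up + (cth + 1j * sth * nz) * dn])

for t in np.arange(0.0, 10.5, 0.5):
    psi_x = np.fft.ifft(evolve(t), axis=1)
    dens = np.sum(np.abs(psi_x) ** 2, axis=0)
    print(f"t = {t:4.1f}   <x> = {np.sum(x * dens) / np.sum(dens):+.3f}")
# <x>(t) quivers at roughly twice the rest energy (Zitterbewegung) while
# slowly drifting.
[/code]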

Zz.
 
  • #97
B. P. Lanyon et al., "Towards quantum chemistry on a quantum computer", Nature Chemistry v.2, p.106 (2010).

Abstract: Exact first-principles calculations of molecular properties are currently intractable because their computational cost grows exponentially with both the number of atoms and basis set size. A solution is to move to a radically different model of computing by building a quantum computer, which is a device that uses quantum systems themselves to store and process data. Here we report the application of the latest photonic quantum computer technology to calculate properties of the smallest molecular system: the hydrogen molecule in a minimal basis. We calculate the complete energy spectrum to 20 bits of precision and discuss how the technique can be expanded to solve large-scale chemical problems that lie beyond the reach of modern supercomputers. These results represent an early practical step toward a powerful tool with a broad range of quantum-chemical applications.

You can read a news report on this work at http://www.wired.com/wiredscience/2010/01/quantum-computer-hydrogen-simulation/

Zz.
 
  • #98
D. W. Berry, et al., "Fair-sampling assumption is not necessary for testing local realism" Phys. Rev. A 81, 012109 (2010).

Abstract: Almost all Bell inequality experiments to date have used postselection and therefore relied on the fair sampling assumption for their interpretation. The standard form of the fair sampling assumption is that the loss is independent of the measurement settings, so the ensemble of detected systems provides a fair statistical sample of the total ensemble. This is often assumed to be needed to interpret Bell inequality experiments as ruling out hidden-variable theories. Here we show that it is not necessary; the loss can depend on measurement settings, provided the detection efficiency factorizes as a function of the measurement settings and any hidden variable. This condition implies that Tsirelson’s bound must be satisfied for entangled states. On the other hand, we show that it is possible for Tsirelson’s bound to be violated while the Clauser-Horne-Shimony-Holt (CHSH)-Bell inequality still holds for unentangled states, and present an experimentally feasible example.

Although I do not care much about interpretational issues and all that nonlocality vs. local realism stuff, a lot of people around here do. Some people on these forums might therefore be interested in this formal treatment of the meaning of fair sampling.
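
As a small aide-memoire for the bounds the abstract refers to, here is the standard CHSH arithmetic for a singlet state, where the quantum correlation is E(a,b) = -cos(a-b): the usual optimal angles give S = 2√2 (Tsirelson's bound), compared with the local-realist bound of 2. A minimal Python sketch:

[code]
import numpy as np

def E(a, b):
    """Quantum prediction for the singlet-state correlation of measurements
    along directions a and b (angles in radians)."""
    return -np.cos(a - b)

# standard optimal CHSH settings
a0, a1 = 0.0, np.pi / 2
b0, b1 = np.pi / 4, 3 * np.pi / 4

S = E(a0, b0) - E(a0, b1) + E(a1, b0) + E(a1, b1)
print(f"|S| = {abs(S):.4f}  (Tsirelson bound 2*sqrt(2) = {2 * np.sqrt(2):.4f}, "
      f"local-realist bound = 2)")
[/code]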
 
  • #99
Holger Müller, Achim Peters, & Steven Chu "A precision measurement of the gravitational redshift by the interference of matter waves", Nature v.463, p.926 (2010).

Abstract: One of the central predictions of metric theories of gravity, such as general relativity, is that a clock in a gravitational potential U will run more slowly by a factor of 1 + U/c^2, where c is the velocity of light, as compared to a similar clock outside the potential. This effect, known as gravitational redshift, is important to the operation of the global positioning system, timekeeping and future experiments with ultra-precise, space-based clocks (such as searches for variations in fundamental constants). The gravitational redshift has been measured using clocks on a tower, an aircraft and a rocket, currently reaching an accuracy of 7 × 10^-5. Here we show that laboratory experiments based on quantum interference of atoms enable a much more precise measurement, yielding an accuracy of 7 × 10^-9. Our result supports the view that gravity is a manifestation of space-time curvature, an underlying principle of general relativity that has come under scrutiny in connection with the search for a theory of quantum gravity. Improving the redshift measurement is particularly important because this test has been the least accurate among the experiments that are required to support curved space-time theories.
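
The size of the effect being measured is easy to appreciate with a one-line estimate of the fractional frequency shift 1 + U/c² ≈ 1 + gΔh/c² between two clocks separated in height (the height differences below are illustrative numbers, not those of the experiment):

[code]
g = 9.81          # m/s^2
c = 2.998e8       # m/s

for dh in (1.0, 100.0, 1e4):                 # height differences in metres
    shift = g * dh / c**2                    # fractional frequency shift
    print(f"dh = {dh:8.0f} m  ->  delta_nu/nu = {shift:.2e}")
# Even a 1 m height difference shifts clock rates by only ~1e-16, which is
# why a 7e-9 test of the redshift is remarkable.
[/code]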

You may read a report on this work at PhysicsWorld: http://physicsworld.com/cws/article/news/41740

Also, note the name of one of the authors of this paper: Steven Chu, who is currently the Secretary of the US Dept. of Energy! :)

Zz.
 
  • #100
H. Shishido et al., "Tuning the Dimensionality of the Heavy Fermion Compound CeIn3" Science v.327, p.980 (2010).

Abstract: Condensed-matter systems that are both low-dimensional and strongly interacting often exhibit unusual electronic properties. Strongly correlated electrons with greatly enhanced effective mass are present in heavy fermion compounds, whose electronic structure is essentially three-dimensional. We realized experimentally a two-dimensional heavy fermion system, adjusting the dimensionality in a controllable fashion. Artificial superlattices of the antiferromagnetic heavy fermion compound CeIn3 and the conventional metal LaIn3 were grown epitaxially. By reducing the thickness of the CeIn3 layers, the magnetic order was suppressed and the effective electron mass was further enhanced. Heavy fermions confined to two dimensions display striking deviations from the standard Fermi liquid low-temperature electronic properties, and these are associated with the dimensional tuning of quantum criticality.

Also see the Perspective article by Piers Coleman in the same issue.

This is very interesting work, since here the "parameter" controlling the quantum phase transition is the dimensionality: going from 3D to 2D.

Zz.
 
  • #101
R. Reyes et al., "Confirmation of general relativity on large scales from weak lensing and galaxy velocities", Nature v.464, p.256 (2010).

Abstract: Although general relativity underlies modern cosmology, its applicability on cosmological length scales has yet to be stringently tested. Such a test has recently been proposed, using a quantity, E_G, that combines measures of large-scale gravitational lensing, galaxy clustering and structure growth rate. The combination is insensitive to ‘galaxy bias’ (the difference between the clustering of visible galaxies and invisible dark matter) and is thus robust to the uncertainty in this parameter. Modified theories of gravity generally predict values of E_G different from the general relativistic prediction because, in these theories, the ‘gravitational slip’ (the difference between the two potentials that describe perturbations in the gravitational metric) is non-zero, which leads to changes in the growth of structure and the strength of the gravitational lensing effect. Here we report that E_G = 0.39 ± 0.06 on length scales of tens of megaparsecs, in agreement with the general relativistic prediction of E_G ≈ 0.4. The measured value excludes a model within the tensor–vector–scalar gravity theory which modifies both Newtonian and Einstein gravity. However, the relatively large uncertainty still permits models within f(R) theory, which is an extension of general relativity. A fivefold decrease in uncertainty is needed to rule out these models.
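
A rough sketch of where the quoted general-relativistic value E_G ≈ 0.4 comes from, using the commonly quoted approximation E_G ≈ Ω_m,0 / f(z) with linear growth rate f(z) ≈ Ω_m(z)^0.55 (the cosmological parameters and redshift below are assumed illustrative values, not necessarily those used in the paper):

[code]
Omega_m0, Omega_L = 0.25, 0.75      # assumed flat LCDM parameters
z = 0.32                            # assumed typical redshift of the galaxy sample

Omega_m_z = Omega_m0 * (1 + z) ** 3 / (Omega_m0 * (1 + z) ** 3 + Omega_L)
f = Omega_m_z ** 0.55               # linear growth rate in GR (approximation)
E_G = Omega_m0 / f
print(f"f(z) = {f:.2f},  E_G(GR) ~ {E_G:.2f}")    # ~0.4
[/code]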

Edit: See PhysicsWorld coverage of this:

http://physicsworld.com/cws/article/news/41948

Zz.
 
  • #102
Y. Kajiwara et al., "Transmission of electrical signals by spin-wave interconversion in a magnetic insulator" Nature v.464, p.262 (2010).

Abstract: The energy bandgap of an insulator is large enough to prevent electron excitation and electrical conduction. But in addition to charge, an electron also has spin, and the collective motion of spin can propagate—and so transfer a signal—in some insulators. This motion is called a spin wave and is usually excited using magnetic fields. Here we show that a spin wave in an insulator can be generated and detected using spin-Hall effects, which enable the direct conversion of an electric signal into a spin wave, and its subsequent transmission through (and recovery from) an insulator over macroscopic distances. First, we show evidence for the transfer of spin angular momentum between an insulator magnet Y3Fe5O12 and a platinum film. This transfer allows direct conversion of an electric current in the platinum film to a spin wave in the Y3Fe5O12 via spin-Hall effects. Second, making use of the transfer in a Pt/Y3Fe5O12/Pt system, we demonstrate that an electric current in one metal film induces voltage in the other, far distant, metal film. Specifically, the applied electric current is converted into spin angular momentum owing to the spin-Hall effect in the first platinum film; the angular momentum is then carried by a spin wave in the insulating Y3Fe5O12 layer; at the distant platinum film, the spin angular momentum of the spin wave is converted back to an electric voltage. This effect can be switched on and off using a magnetic field. Weak spin damping in Y3Fe5O12 is responsible for its transparency for the transmission of spin angular momentum. This hybrid electrical transmission method potentially offers a means of innovative signal delivery in electrical circuits and devices.

This appears to be the first instance of an electrical signal being transmitted via spin waves. It should bring the possibility of spintronics a step closer to reality.

Zz.
 
  • #103
P.J. Mohr and D.B. Newell, "The physics of fundamental constants", Am. J. Phys. v.78, p.338 (2010).

Abstract: This Resource Letter provides a guide to the literature on the physics of fundamental constants and their values as determined within the International System of Units (SI). Journal articles, books, and websites that provide relevant information are surveyed. Literature on redefining the SI in terms of exact values of fundamental constants is also included.

A very useful paper to have and to keep. Not only does it describe all of the major fundamental constants of our universe that we know of so far, but it also describes how they are measured/determined, AND gives you a boatload of references along with each of these constants.

Zz.
 
  • #104
M. Rypdal and K. Rypdal, "Testing Hypotheses about Sun-Climate Complexity Linking", Phys. Rev. Lett. v.104, p.128501 (2010).

Abstract: We reexamine observational evidence presented in support of the hypothesis of a sun-climate complexity linking by N. Scafetta and B. J. West, Phys. Rev. Lett. 90, 248701 (2003), which contended that the integrated solar flare index (SFI) and the global temperature anomaly (GTA) both follow Lévy walk statistics with the same waiting-time exponent μ≈2.1. However, their analysis does not account for trends in the signal, cannot deal correctly with infinite variance processes (Lévy flights), and suffers from considering only the second moment. Our analysis shows that properly detrended, the integrated SFI is well described as a Lévy flight, and the integrated GTA as a persistent fractional Brownian motion. These very different stochastic properties of the solar and climate records do not support the hypothesis of a sun-climate complexity linking.

The preprint of the manuscript can be found at http://complexityandplasmas.net/Preprints_files/sun-climate%20complexity%20link.pdf

Zz.
 
  • #105
S. Pironio et al., "Random numbers certified by Bell’s theorem", Nature v.464, p.1021 (2010).

Abstract: Randomness is a fundamental feature of nature and a valuable resource for applications ranging from cryptography and gambling to numerical simulation of physical and biological systems. Random numbers, however, are difficult to characterize mathematically, and their generation must rely on an unpredictable physical process. Inaccuracies in the theoretical modelling of such processes or failures of the devices, possibly due to adversarial attacks, limit the reliability of random number generators in ways that are difficult to control and detect. Here, inspired by earlier work on non-locality-based and device-independent quantum information processing, we show that the non-local correlations of entangled quantum particles can be used to certify the presence of genuine randomness. It is thereby possible to design a cryptographically secure random number generator that does not require any assumption about the internal working of the device. Such a strong form of randomness generation is impossible classically and possible in quantum systems only if certified by a Bell inequality violation. We carry out a proof-of-concept demonstration of this proposal in a system of two entangled atoms separated by approximately one metre. The observed Bell inequality violation, featuring near perfect detection efficiency, guarantees that 42 new random numbers are generated with 99 per cent confidence. Our results lay the groundwork for future device-independent quantum information experiments and for addressing fundamental issues raised by the intrinsic randomness of quantum theory.
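
For orientation, the certification works by converting an observed CHSH value S into a bound on the min-entropy of the outputs; a bound of the form H∞ ≥ n·f(S) with f(S) = 1 - log2(1 + √(2 - S²/4)) is associated with this work, though the exact functional form here is quoted from memory and should be checked against the paper. A tiny Python check of its limiting cases:

[code]
import math

def f(S):
    """Certified min-entropy rate (bits per run) for CHSH value S.
    Functional form quoted from memory -- verify against the paper."""
    return 1 - math.log2(1 + math.sqrt(2 - S**2 / 4))

for S in (2.0, 2.5, 2 * math.sqrt(2)):
    print(f"S = {S:.3f}  ->  certified bits per run >= {f(S):.3f}")
# S = 2 (the classical bound) certifies nothing; S = 2*sqrt(2) (Tsirelson's
# bound) would certify one full bit per run.
[/code]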

Zz.
 
