Recent Noteworthy Physics Papers

  • Thread starter: ZapperZ
  • Tags: Papers, Physics
AI Thread Summary
The thread highlights recent noteworthy physics papers published in respected peer-reviewed journals, emphasizing the importance of providing full references and abstracts. Key papers discussed include a new determination of the fine structure constant through precise measurements and QED calculations, a detailed model of pebble erosion, and experimental investigations into entangled measurements that challenge local realism. Other significant contributions involve studies on the behavior of mesoscopic circuits and the interplay between electron-lattice interactions and superconductivity in high-Tc materials. The thread serves as a resource for sharing and recognizing impactful research in the field of physics.
  • #51
S. Fölling et al., "Direct observation of second-order atom tunnelling", Nature v.448, p.1029 (2007).

Abstract: Tunnelling of material particles through a classically impenetrable barrier constitutes one of the hallmark effects of quantum physics. When interactions between the particles compete with their mobility through a tunnel junction, intriguing dynamical behaviour can arise because the particles do not tunnel independently. In single-electron or Bloch transistors, for example, the tunnelling of an electron or Cooper pair can be enabled or suppressed by the presence of a second charge carrier due to Coulomb blockade. Here we report direct, time-resolved observations of the correlated tunnelling of two interacting ultracold atoms through a barrier in a double-well potential. For the regime in which the interactions between the atoms are weak and tunnel coupling dominates, individual atoms can tunnel independently, similar to the case of a normal Josephson junction. However, when strong repulsive interactions are present, two atoms located on one side of the barrier cannot separate, but are observed to tunnel together as a pair in a second-order co-tunnelling process. By recording both the atom position and phase coherence over time, we fully characterize the tunnelling process for a single atom as well as the correlated dynamics of a pair of atoms for weak and strong interactions. In addition, we identify a conditional tunnelling regime in which a single atom can only tunnel in the presence of a second particle, acting as a single atom switch. Such second-order tunnelling events, which are the dominating dynamical effect in the strongly interacting regime, have not been previously observed with ultracold atoms. Similar second-order processes form the basis of superexchange interactions between atoms on neighbouring lattice sites of a periodic potential, a central component of proposals for realizing quantum magnetism.
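As a rough illustration of the mechanism (a minimal sketch of my own, not the authors' model): two interacting bosons in a double well can be described by a two-site Bose-Hubbard Hamiltonian, and for strong repulsion U >> J the doubly occupied well cannot empty by single-atom tunneling; instead the pair swaps wells coherently at a much slower, second-order rate of order J^2/U.

```python
import numpy as np

# Minimal sketch (not the authors' full model): two bosons in a double well,
# i.e. a two-site Bose-Hubbard Hamiltonian in the basis {|2,0>, |1,1>, |0,2>},
# with hopping J and on-site repulsion U (hbar = 1).
J, U = 1.0, 20.0
H = np.array([[U,               -np.sqrt(2) * J,  0.0],
              [-np.sqrt(2) * J,  0.0,            -np.sqrt(2) * J],
              [0.0,             -np.sqrt(2) * J,  U]])

evals, evecs = np.linalg.eigh(H)

def populations(t, psi0=np.array([1.0, 0.0, 0.0])):
    """Populations of |2,0>, |1,1>, |0,2> after evolving |2,0> for time t."""
    psi_t = evecs @ (np.exp(-1j * evals * t) * (evecs.T @ psi0))
    return np.abs(psi_t) ** 2

# Second-order perturbation theory: the pair swaps wells with coupling
# ~2 J^2 / U, i.e. a full swap after roughly t = pi * U / (4 J^2).
t_swap = np.pi * U / (4 * J**2)
for t in (0.0, t_swap / 2, t_swap):
    p20, p11, p02 = populations(t)
    print(f"t = {t:6.2f}   P(2,0) = {p20:.2f}   P(1,1) = {p11:.2f}   P(0,2) = {p02:.2f}")
```

In the printout the intermediate |1,1> configuration stays almost unpopulated during the swap, which is the pair (co-tunnelling) behaviour described in the abstract; for U << J the two atoms would instead tunnel independently.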

I am highlighting this paper to show how difficult it is to get whole atoms to tunnel through a barrier. We constantly get questions (and hypotheses) about things like tennis balls or even a person tunneling through walls, under the pretense that since tunneling is real for single particles, then in principle whole macroscopic objects can tunnel as well. This is a fallacy.

The requirements and complications for whole objects to undergo quantum tunneling are astounding. Requiring that each part of the atom, or each part of the object, be in total coherence with every other part for the whole thing to tunnel through is one almost-impossible barrier (no pun intended). As can be seen just from this experiment, other effects that are not present or not significant in single-particle tunneling start to creep in. The nature of the barrier and what is embedded in it plays a larger role in such a tunneling process. It is neither easy nor obvious that such macroscopic-object tunneling can be done. It is already this difficult for simple atoms, which, in the scale of things, can easily be made coherent across all of their internal parts. The same cannot be said of a tennis ball.
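To put a rough number on the mass dependence alone, here is a back-of-the-envelope WKB estimate (my own sketch, not from the paper; the 1 eV, 1 nm barrier is an arbitrary choice, and the object is treated as a structureless point particle, which sidesteps the internal-coherence problem entirely):

```python
import math

# Back-of-the-envelope WKB sketch (not from the paper): tunnelling probability
# T ~ exp(-2 * kappa * d) with kappa = sqrt(2 m V) / hbar, for a barrier of
# height V = 1 eV and width d = 1 nm (arbitrary choices), treating the object
# as a single structureless point particle.
hbar = 1.054571817e-34      # J s
eV = 1.602176634e-19        # J
V, d = 1.0 * eV, 1e-9       # barrier height and width

for label, m in [("electron",    9.109e-31),
                 ("Rb-87 atom",  1.443e-25),
                 ("tennis ball", 5.7e-2)]:   # masses in kg
    kappa = math.sqrt(2.0 * m * V) / hbar
    log10_T = -2.0 * kappa * d / math.log(10.0)   # work with log10 to avoid underflow
    print(f"{label:12s}  log10(T) ~ {log10_T:.3g}")
```

Even in this wildly optimistic single-particle picture, the exponent is modest for an electron, already thousands of orders of magnitude for a single atom, and utterly hopeless for anything macroscopic.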

Zz.
 
  • #52
V. Parigi et al., "Probing Quantum Commutation Rules by Addition and Subtraction of Single Photons to/from a Light Field", Science v.317, p.1890 (2007).

Abstract: The possibility of arbitrarily "adding" and "subtracting" single photons to and from a light field may give access to a complete engineering of quantum states and to fundamental quantum phenomena. We experimentally implemented simple alternated sequences of photon creation and annihilation on a thermal field and used quantum tomography to verify the peculiar character of the resulting light states. In particular, as the final states depend on the order in which the two actions are performed, we directly observed the noncommutativity of the creation and annihilation operators, one of the cardinal concepts of quantum mechanics, at the basis of the quantum behavior of light. These results represent a step toward the full quantum control of a field and may provide new resources for quantum information protocols.

Read also the Perspective on this paper by R. Boyd et al. in the same issue of the journal. In that Perspective, the description of what has been accomplished can be summed up in these two paragraphs:

In an intriguing and illustrative report on page 1890 of this issue, Parigi et al. (3) present the results of a laboratory demonstration of what happens in the quantum mechanical operations of photon creation and annihilation, which lack commutativity. These authors add a single photon to a light beam, which corresponds to the action of the standard quantum mechanical creation operator â†. They can also subtract a single photon from the light beam, which corresponds to the annihilation operator â.

Parigi et al. measure the quantum mechanical state of a thermal light field after performing these two operations on it, and they show that the final state depends on the order in which the operations are performed. This result is a striking confirmation of the lack of commutativity of quantum mechanical operators. Moreover, the authors present the strongly counterintuitive result that, under certain conditions, the removal of a photon from a light field can lead to an increase in the mean number of photons in that light field, as predicted earlier.
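Both points are easy to check numerically. A minimal sketch of my own in a truncated Fock space (not the authors' analysis): the operators fail to commute, and subtracting a photon from thermal light with mean photon number nbar leaves a state with mean photon number 2*nbar.

```python
import numpy as np

# Minimal sketch (mine, not the authors' analysis): creation/annihilation
# operators in a truncated Fock space, applied to a thermal state.
N = 60                      # Fock-space truncation (much larger than nbar)
nbar = 2.0                  # mean photon number of the thermal state

a = np.diag(np.sqrt(np.arange(1, N)), k=1)   # annihilation operator
a_dag = a.conj().T                           # creation operator
num = a_dag @ a                              # photon-number operator

# Thermal state: p_n proportional to (nbar / (1 + nbar))**n
n = np.arange(N)
p = (nbar / (1.0 + nbar)) ** n
p /= p.sum()
rho = np.diag(p).astype(complex)

def mean_n(r):
    return np.real(np.trace(num @ r))

# "Subtract" a photon: rho -> a rho a_dag, renormalised
rho_sub = a @ rho @ a_dag
rho_sub /= np.trace(rho_sub)

print(mean_n(rho))       # ~2.0
print(mean_n(rho_sub))   # ~4.0: subtracting a photon doubles <n> for thermal light

# Non-commutativity: [a, a_dag] = 1 (away from the truncation edge)
comm = a @ a_dag - a_dag @ a
print(np.allclose(comm[:-1, :-1], np.eye(N - 1)))   # True
```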

Zz.
 
  • #53
A. L. Cavalieri et al., "Attosecond spectroscopy in condensed matter", Nature v.449, p.1029 (2007).

Abstract: Comprehensive knowledge of the dynamic behaviour of electrons in condensed-matter systems is pertinent to the development of many modern technologies, such as semiconductor and molecular electronics, optoelectronics, information processing and photovoltaics. Yet it remains challenging to probe electronic processes, many of which take place in the attosecond (1 as = 10^-18 s) regime. In contrast, atomic motion occurs on the femtosecond (1 fs = 10^-15 s) timescale and has been mapped in solids in real time using femtosecond X-ray sources. Here we extend the attosecond techniques previously used to study isolated atoms in the gas phase to observe electron motion in condensed-matter systems and on surfaces in real time. We demonstrate our ability to obtain direct time-domain access to charge dynamics with attosecond resolution by probing photoelectron emission from single-crystal tungsten. Our data reveal a delay of approximately 100 attoseconds between the emission of photoelectrons that originate from localized core states of the metal, and those that are freed from delocalized conduction-band states. These results illustrate that attosecond metrology constitutes a powerful tool for exploring not only gas-phase systems, but also fundamental electronic processes occurring on the attosecond timescale in condensed-matter systems and on surfaces.

Please read the News and Views article on this work in the same issue of Nature, as well as the PhysicsWorld report at http://physicsworld.com/cws/article/news/31566.

Zz.
 
  • #54
M. König et al, "Quantum Spin Hall Insulator State in HgTe Quantum Wells", Science v.318, p.766 (2007)

Abstract: Recent theory predicted that the quantum spin Hall effect, a fundamentally new quantum state of matter that exists at zero external magnetic field, may be realized in HgTe/(Hg,Cd)Te quantum wells. We fabricated such sample structures with low density and high mobility in which we could tune, through an external gate voltage, the carrier conduction from n-type to p-type, passing through an insulating regime. For thin quantum wells with well width d < 6.3 nanometers, the insulating regime showed the conventional behavior of vanishingly small conductance at low temperature. However, for thicker quantum wells (d > 6.3 nanometers), the nominally insulating regime showed a plateau of residual conductance close to 2e^2/h, where e is the electron charge and h is Planck's constant. The residual conductance was independent of the sample width, indicating that it is caused by edge states. Furthermore, the residual conductance was destroyed by a small external magnetic field. The quantum phase transition at the critical thickness, d = 6.3 nanometers, was also independently determined from the magnetic field–induced insulator-to-metal transition. These observations provide experimental evidence of the quantum spin Hall effect.

Read also the Perspective on this paper by Nagaosa in the same issue of Science. This is very strong evidence for the quantum spin Hall effect.

Zz.
 
  • #55
D. Akoury et al., "The Simplest Double Slit: Interference and Entanglement in Double Photoionization of H2", Science v.318, p.949 (2007)

Abstract: The wave nature of particles is rarely observed, in part because of their very short de Broglie wavelengths in most situations. However, even with wavelengths close to the size of their surroundings, the particles couple to their environment (for example, by gravity, Coulomb interaction, or thermal radiation). These couplings shift the wave phases, often in an uncontrolled way, and the resulting decoherence, or loss of phase integrity, is thought to be a main cause of the transition from quantum to classical behavior. How much interaction is needed to induce this transition? Here we show that a photoelectron and two protons form a minimum particle/slit system and that a single additional electron constitutes a minimum environment. Interference fringes observed in the angular distribution of a single electron are lost through its Coulomb interaction with a second electron, though the correlated momenta of the entangled electron pair continue to exhibit quantum interference.

Also see the review of this work at PhysicsWorld, http://physicsworld.com/cws/article/news/31763;jsessionid=588A14F90F3040EC50539DE92CD8249A (free registration required), and at PhysOrg.

Zz.
 
  • #56
L. Ozyuzer et al., "Emission of Coherent THz Radiation from Superconductors", Science v.318, p.1291 (2007).

Abstract: Compact solid-state sources of terahertz (THz) radiation are being sought for sensing, imaging, and spectroscopy applications across the physical and biological sciences. We demonstrate that coherent continuous-wave THz radiation of sizable power can be extracted from intrinsic Josephson junctions in the layered high-temperature superconductor Bi2Sr2CaCu2O8. In analogy to a laser cavity, the excitation of an electromagnetic cavity resonance inside the sample generates a macroscopic coherent state in which a large number of junctions are synchronized to oscillate in phase. The emission power is found to increase as the square of the number of junctions reaching values of 0.5 microwatt at frequencies up to 0.85 THz, and persists up to ~50 kelvin. These results should stimulate the development of superconducting compact sources of THz radiation.
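The N^2 scaling quoted in the abstract is just the coherent addition of the fields from N phase-locked junctions. A quick illustration of my own (not the paper's model):

```python
import numpy as np

# Quick illustration (mine, not the paper's model): N phase-locked oscillators
# add in amplitude, so the radiated power scales as N^2; with random phases
# the power only scales as ~N.
rng = np.random.default_rng(1)
N = 1000

coherent_power = abs(np.sum(np.ones(N))) ** 2          # all phases equal -> N^2
incoherent_power = np.mean([abs(np.sum(np.exp(1j * rng.uniform(0, 2 * np.pi, N)))) ** 2
                            for _ in range(500)])       # random phases -> ~N on average

print(coherent_power, incoherent_power)   # 1e6 vs. roughly 1e3
```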

Also read the Perspective on this work in the same issue of Science. A PhysicsWorld review of this work can be found at http://physicsworld.com/cws/article/news/31957;jsessionid=909ED81C8AAAF5912FD9C1CED4C2FA94.

I know quite a bit about this work, since I've done tunneling spectroscopy on superconductors and I've worked with this Bi compound. Not only that, I personally know two of the authors of this paper, including the lead author. In fact, I believe I was in the lab when they were doing this work (yes, I'm nosy and tend to stick my nose into people's labs, if they let me). So I'm terribly happy that they've managed to publish in Science and get quite a bit of publicity for this work.

Secondly, note that this is simply another one in a long line of examples where something that appears to be esoteric and purely "physics", such as quantum tunneling, Josephson currents, and the physics of superconductivity, can produce a clear, useful application. This is still physics, not engineering. Yet there is a clear application of a physics principle at work here.

Zz.
 
  • #57
T. Paterek et al. "Experimental Test of Non-Local Realistic Theories Without The Rotational Symmetry Assumption", Phys. Rev. Lett. 99, 210406 (2007).

Abstract: We analyze the class of nonlocal realistic theories that was originally considered by Leggett [Found. Phys. 33, 1469 (2003)] and tested by us in a recent experiment [Nature (London) 446, 871 (2007)]. We derive an incompatibility theorem that works for finite numbers of polarizer settings and that does not require the previously assumed rotational symmetry of the two-particle correlation functions. The experimentally measured case involves seven different measurement settings. Using polarization-entangled photon pairs, we exclude this broader class of nonlocal realistic models by experimentally violating a new Leggett-type inequality by 80 standard deviations.

This appears to be a follow-up to their earlier Nature paper, which was highlighted earlier in this thread (https://www.physicsforums.com/showpost.php?p=1307660&postcount=40). They claim to have excluded an even larger class of nonlocal realistic models.

Interestingly enough, there is ANOTHER paper right after this that presents a similar report of the violation of the Leggett inequality.

Cyril Branciard et al., "Experimental Falsification of Leggett's Nonlocal Variable Model", Phys. Rev. Lett. 99, 210407 (2007).

Abstract: Bell's theorem guarantees that no model based on local variables can reproduce quantum correlations. Also, some models based on nonlocal variables, if subject to apparently “reasonable” constraints, may fail to reproduce quantum physics. In this Letter, we introduce a family of inequalities, which use a finite number of measurement settings, and which therefore allow testing Leggett's nonlocal model versus quantum physics. Our experimental data falsify Leggett's model and are in agreement with quantum predictions.

When you have two different experiments done by two independent groups coming up with the same conclusion, it makes for a very convincing argument for the validity of that conclusion.

Zz.
 
  • #58
Johannes Kofler and Časlav Brukner, "Classical World Arising out of Quantum Physics under the Restriction of Coarse-Grained Measurements", Phys. Rev. Lett. 99, 180403 (2007).

Abstract: Conceptually different from the decoherence program, we present a novel theoretical approach to macroscopic realism and classical physics within quantum theory. It focuses on the limits of observability of quantum effects of macroscopic objects, i.e., on the required precision of our measurement apparatuses such that quantum phenomena can still be observed. First, we demonstrate that for unrestricted measurement accuracy, no classical description is possible for arbitrarily large systems. Then we show for a certain time evolution that under coarse-grained measurements, not only macrorealism but even classical Newtonian laws emerge out of the Schrödinger equation and the projection postulate.

A news report on this work can be found in Nature's daily science news at http://www.nature.com/news/2007/071122/full/news.2007.277.html (link may be restricted or open only for a limited time). This is an interesting and important work because they are trying to show the quantum-to-classical "transition" via a different approach than the standard decoherence scenario. Essentially, the coarse-grained measurements that we make cause the classical world to emerge. If we make our measurements more precise for a system that has a large number of particles, then we should start detecting random "jumps" in the system that signify the emergence of the quantum world.

So let's see some clever experimentalist design an experiment to verify this. :)

Zz.
 
  • #59
This is not entirely new research as such, but today's review article in Nature is still noteworthy as a review.

P. Monthoux, D. Pines and G. G. Lonzarich, Nature 450, 1177 (2007)

Abstract: The idea of superconductivity without the mediating role of lattice vibrations (phonons) has a long history. It was realized soon after the publication of the Bardeen–Cooper–Schrieffer (BCS) theory of superconductivity 50 years ago that a full treatment of both the charge and spin degrees of freedom of the electron predicts the existence of attractive components of the effective interaction between electrons even in the absence of lattice vibrations—a particular example is the effective interaction that depends on the relative spins of the electrons. Such attraction without phonons can lead to electronic pairing and to unconventional forms of superconductivity that can be much more sensitive than traditional (BCS) superconductivity to the precise details of the crystal structure and to the electronic and magnetic properties of a material.

The paper discusses how superconductivity may be found near the onset of a magnetically ordered state (or, analogously, a charge-density-ordered state). The phase space for an effectively attractive electron-electron interaction is shown to be much wider than that traditionally assumed for phonon-mediated interactions.

http://www.nature.com/nature/journal/v450/n7173/pdf/nature06480.pdf

http://www.eurekalert.org/pub_releases/2007-12/danl-tqf122007.php
 
  • #60
C. Hertlein et al, "Direct measurement of critical Casimir forces", Nature v.451, p.172 (2008).

Abstract: When fluctuating fields are confined between two surfaces, long-range forces arise. A famous example is the quantum-electrodynamical Casimir force that results from zero-point vacuum fluctuations confined between two conducting metal plates. A thermodynamic analogue is the critical Casimir force: it acts between surfaces immersed in a binary liquid mixture close to its critical point and arises from the confinement of concentration fluctuations within the thin film of fluid separating the surfaces. So far, all experimental evidence for the existence of this effect has been indirect. Here we report the direct measurement of critical Casimir force between a single colloidal sphere and a flat silica surface immersed in a mixture of water and 2,6-lutidine near its critical point. We use total internal reflection microscopy to determine in situ the forces between the sphere and the surface, with femtonewton resolution. Depending on whether the adsorption preferences of the sphere and the surface for water and 2,6-lutidine are identical or opposite, we measure attractive and repulsive forces, respectively, that agree quantitatively with theoretical predictions and exhibit exquisite dependence on the temperature of the system. We expect that these features of critical Casimir forces may result in novel uses of colloids as model systems.

Also read the News and Views article on this work in the same issue of Nature, and a PhysicsWorld report at http://physicsworld.com/cws/article/news/32380.

Zz.
 
  • #61
I. Ferreras et al., "Necessity of Dark Matter in Modified Newtonian Dynamics within Galactic Scales", Phys. Rev. Lett. v.100, p.031302 (2008).

Abstract: To test modified Newtonian dynamics (MOND) on galactic scales, we study six strong gravitational lensing early-type galaxies from the CASTLES sample. Comparing the total mass (from lensing) with the stellar mass content (from a comparison of photometry and stellar population synthesis), we conclude that strong gravitational lensing on galactic scales requires a significant amount of dark matter, even within MOND. On such scales a 2 eV neutrino cannot explain the excess of matter in contrast with recent claims to explain the lensing data of the bullet cluster. The presence of dark matter is detected in regions with a higher acceleration than the characteristic MOND scale of ~10^-10 m/s^2. This is a serious challenge to MOND unless lensing is qualitatively different [possibly to be developed within a covariant, such as Tensor-Vector-Scalar (TeVeS), theory]

With the Bullet cluster evidence, and now this, could MOND be in serious trouble now?

Zz.
 
  • #62
A.J. Leggett, "Realism and the physical world", Rep. Prog. Phys. v.71, p.022001 (2008)

Abstract: I consider the extent to which the applicability of the concept of classical realism is constrained, irrespective of the validity or not of the quantum formalism, by existing experiments both in the EPR–Bell setup, including recent experiments testing 'nonlocal realistic' theories, and in the area of 'macroscopic quantum coherence'. Unless we are willing to sacrifice one or more other intuitively plausible notions such as that of the conventional 'arrow of time', it appears impossible, in either context, to maintain the classical notion of realism.

Zz.
 
  • #63
A. Caprez et al., "A macroscopic test of the Aharonov-Bohm effect", Phys. Rev. Lett., v99, p.210401 (2007).

Abstract: The Aharonov-Bohm (AB) effect is a purely quantum mechanical effect. The original (classified as Type-I) AB-phase shift exists in experimental conditions where the electromagnetic fields and forces are zero. It is the absence of forces that makes the AB-effect entirely quantum mechanical. Although the AB-phase shift has been demonstrated unambiguously, the absence of forces in Type-I AB-effects has never been shown. Here, we report the observation of the absence of time delays associated with forces of the magnitude needed to explain the AB-phase shift for a macroscopic system.

Also see the arXiv version at http://arxiv.org/abs/0708.2428.

A Perspective on this work can also be found in March 20, 2008 issue of Nature (Nature, v.452, p.298 (2008)).

Looks like the AB effect is non-local after all!

Zz.
 
  • #64
A.N. Pasupathy et al., "Electronic Origin of the Inhomogeneous Pairing Interaction in the High-Tc Superconductor Bi2Sr2CaCu2O8+δ", Science v.320, p.196 (2008).

Abstract: Identifying the mechanism of superconductivity in the high-temperature cuprate superconductors is one of the major outstanding problems in physics. We report local measurements of the onset of superconducting pairing in the high–transition temperature (Tc) superconductor Bi2Sr2CaCu2O8+δ using a lattice-tracking spectroscopy technique with a scanning tunneling microscope. We can determine the temperature dependence of the pairing energy gaps, the electronic excitations in the absence of pairing, and the effect of the local coupling of electrons to bosonic excitations. Our measurements reveal that the strength of pairing is determined by the unusual electronic excitations of the normal state, suggesting that strong electron-electron interactions rather than low-energy (<0.1 volts) electron-boson interactions are responsible for superconductivity in the cuprates.

A news report on this work can be found on ScienceDaily at http://www.sciencedaily.com/releases/2008/04/080410140538.htm. So the cuprates may not have a "glue" that is responsible for the superconducting mechanism? Oh my! Phil Anderson might be right after all! :)

Zz.
 
  • #65
E.V. Linder, "Mapping the cosmological expansion", Rep. Prog. Phys. v.71, p.056901 (2008).

Abstract: The ability to map the cosmological expansion has developed enormously, spurred by the turning point one decade ago of the discovery of cosmic acceleration. The standard model of cosmology has shifted from a matter dominated, standard gravity, decelerating expansion to the present search for the origin of acceleration in the cosmic expansion. We present a wide ranging review of the tools, challenges and physical interpretations. The tools include direct measures of cosmic scales through Type Ia supernova luminosity distances, and angular distance scales of baryon acoustic oscillation and cosmic microwave background density perturbations, as well as indirect probes such as the effect of cosmic expansion on the growth of matter density fluctuations. Accurate mapping of the expansion requires understanding of systematic uncertainties in both the measurements and the theoretical framework, but the result will give important clues to the nature of the physics behind accelerating expansion and to the fate of the universe.

An excellent review source, especially if you're interested in how various results in cosmology are obtained.

Zz.
 
  • #66
D. N. Matsukevich et al., "Bell Inequality Violation with Two Remote Atomic Qubits", Phys. Rev. Lett. v.100, p.150404 (2008).

Abstract: We observe violation of a Bell inequality between the quantum states of two remote Yb+ ions separated by a distance of about 1 m with the detection loophole closed. The heralded entanglement of two ions is established via interference and joint detection of two emitted photons, whose polarization is entangled with each ion. The entanglement of remote qubits is also characterized by full quantum state tomography.

Could we be on a clear path to a loophole-free Bell-type experiment? This report certainly provides convincing evidence that we are well on our way!

Zz.
 
  • #67
M.T. Murphy et al. "Strong Limit on a Variable Proton-to-Electron Mass Ratio from Molecules in the Distant Universe", Science v. 320, p. 1611 (2008).

Abstract: The Standard Model of particle physics assumes that the so-called fundamental constants are universal and unchanging. Absorption lines arising in molecular clouds along quasar sightlines offer a precise test for variations in the proton-to-electron mass ratio, µ, over cosmological time and distance scales. The inversion transitions of ammonia are particularly sensitive to µ as compared to molecular rotational transitions. Comparing the available ammonia spectra observed toward the quasar B0218+357 with new, high-quality rotational spectra, we present the first detailed measurement of µ with this technique, limiting relative deviations from the laboratory value to |Δµ/µ| < 1.8 x 10^-6 (95% confidence level) at approximately half the universe's current age—the strongest astrophysical constraint to date. Higher-quality ammonia observations will reduce both the statistical and systematic uncertainties in these observations.

In other words, even as far back as half of the universe's age, the ratio of the mass of the proton to the mass of the electron hasn't changed, up to the accuracy limit of this measurement.

Zz.
 
  • #68
L. Li et al., "Phase Transitions of Dirac Electrons in Bismuth", Science v.321, p.547 (2008).

Abstract: The Dirac Hamiltonian, which successfully describes relativistic fermions, applies equally well to electrons in solids with linear energy dispersion, for example, in bismuth and graphene. A characteristic of these materials is that a magnetic field less than 10 tesla suffices to force the Dirac electrons into the lowest Landau level, with resultant strong enhancement of the Coulomb interaction energy. Moreover, the Dirac electrons usually come with multiple flavors or valley degeneracy. These ingredients favor transitions to a collective state with novel quantum properties in large field. By using torque magnetometry, we have investigated the magnetization of bismuth to fields of 31 tesla. We report the observation of sharp field-induced phase transitions into a state with striking magnetic anisotropy, consistent with the breaking of the threefold valley degeneracy.

Read a report on this work at ScienceDaily: http://www.sciencedaily.com/releases/2008/07/080725152314.htm.

This is another example showing that relativistic equations do not require esoteric conditions to be applicable. Some of them apply to the very materials that we use in our electronics.

Zz.
 
  • #69
Kamihara, Y. et al. "Iron-based superconductor LaO1-xFxFeAs (x=0.05-0.12) with Tc=26 K". J. Am. Chem. Soc. 130, 3296 (2008).

Abstract: We report that a layered iron-based compound LaOFeAs undergoes superconducting transition under doping with F- ions at the O2- site. The transition temperature (Tc) exhibits a trapezoid shape dependence on the F- content, with the highest Tc of ~26 K at ~11 atom %.

Full paper available here: http://pubs.acs.org/cgi-bin/abstract.cgi/jacsat/2008/130/i11/abs/ja800073m.html
For further developments, see:
Science Daily
http://www.natureasia.com/asia-materials/highlight.php?id=222

Related follow-up:
Chen, X.H. et al. "Superconductivity at 43 K in SmFeAsO1-xFx", Nature 453, 761 (2008). This was the first observation of high-Tc behavior outside of the cuprate systems.
 
  • #70
L.W. Martin, et al., "Electric field control of ferromagnetism using a magnetoelectric multiferroic," Nature Mater. 7, 478 (2008)

Abstract: Multiferroics are of interest for memory and logic device applications, as the coupling between ferroelectric and magnetic properties enables the dynamic interaction between these order parameters. Here, we report an approach to control and switch local ferromagnetism with an electric field using multiferroics. We use two types of electromagnetic coupling phenomenon that are manifested in heterostructures consisting of a ferromagnet in intimate contact with the multiferroic BiFeO3. The first is an internal, magnetoelectric coupling between antiferromagnetism and ferroelectricity in the BiFeO3 film that leads to electric-field control of the antiferromagnetic order. The second is based on exchange interactions at the interface between a ferromagnet (Co0.9Fe0.1) and the antiferromagnet. We have discovered a one-to-one mapping of the ferroelectric and ferromagnetic domains, mediated by the colinear coupling between the magnetization in the ferromagnet and the projection of the antiferromagnetic order in the multiferroic. Our preliminary experiments reveal the possibility to locally control ferromagnetism with an electric field.

Full text and summary available here: http://www-als.lbl.gov/als/science/sci_archive/171magnetism.html

For a review on multiferroics, see: Ying-Hao Chu et al, "Controlling magnetism with multiferroics", Materials Today 10, 16 (2007) http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6X1J-4PND5YK-S&_user=10&_rdoc=1&_fmt=&_orig=search&_sort=d&view=c&_version=1&_urlVersion=0&_userid=10&md5=0a7fb2548257eb459e194e1903854b23

The ability to control ferromagnetism using electric fields has huge potential in the area of GMR based memory storage devices.
 
  • #71
D. Salart et al., "Testing the speed of 'spooky action at a distance'", Nature v.454, p.861 (2008).

Abstract: Correlations are generally described by one of two mechanisms: either a first event influences a second one by sending information encoded in bosons or other physical carriers, or the correlated events have some common causes in their shared history. Quantum physics predicts an entirely different kind of cause for some correlations, named entanglement. This reveals itself in correlations that violate Bell inequalities (implying that they cannot be described by common causes) between space-like separated events (implying that they cannot be described by classical communication). Many Bell tests have been performed, and loopholes related to locality and detection have been closed in several independent experiments. It is still possible that a first event could influence a second, but the speed of this hypothetical influence (Einstein's 'spooky action at a distance') would need to be defined in some universal privileged reference frame and be greater than the speed of light. Here we put stringent experimental bounds on the speed of all such hypothetical influences. We performed a Bell test over more than 24 hours between two villages separated by 18 km and approximately east–west oriented, with the source located precisely in the middle. We continuously observed two-photon interferences well above the Bell inequality threshold. Taking advantage of the Earth's rotation, the configuration of our experiment allowed us to determine, for any hypothetically privileged frame, a lower bound for the speed of the influence. For example, if such a privileged reference frame exists and is such that the Earth's speed in this frame is less than 10^-3 times that of the speed of light, then the speed of the influence would have to exceed that of light by at least four orders of magnitude.

Also read the News and Views article in the same issue.

Edit: A report on this work can also be found at PhysicsWorld: http://physicsworld.com/cws/article/news/35404.

Zz.
 
  • #72
FUNKER said:
just want to say I fully AGREE! :!)


yes yes :))))
 
  • #73
C.G. Camara et al. "Correlation between nanosecond X-ray flashes and stick–slip friction in peeling tape", Nature v.455, p.1089 (2008).

Abstract: Relative motion between two contacting surfaces can produce visible light, called triboluminescence. This concentration of diffuse mechanical energy into electromagnetic radiation has previously been observed to extend even to X-ray energies. Here we report that peeling common adhesive tape in a moderate vacuum produces radio and visible emission along with nanosecond, 100-mW X-ray pulses that are correlated with stick–slip peeling events. For the observed 15-keV peak in X-ray energy, various models give a competing picture of the discharge process, with the length of the gap between the separating faces of the tape being 30 or 300 µm at the moment of emission. The intensity of X-ray triboluminescence allowed us to use it as a source for X-ray imaging. The limits on energies and flash widths that can be achieved are beyond current theories of tribology.

This work has been getting a lot of popular media coverage because the simple act of peeling ordinary Scotch tape in a moderate vacuum can actually generate short X-ray bursts.

Zz.
 
  • #74
Event-by-Event Simulation of Einstein-Podolsky-Rosen-Bohm Experiments:

http://www.springerlink.com/content/p28v88867w7213mu/ Open Access
http://arxiv.org/pdf/0712.3693

Abstract: We construct an event-based computer simulation model of the Einstein-Podolsky-Rosen-Bohm experiments with photons. The algorithm is a one-to-one copy of the data gathering and analysis procedures used in real laboratory experiments. We consider two types of experiments, those with a source emitting photons with opposite but otherwise unpredictable polarization and those with a source emitting photons with fixed polarization. In the simulation, the choice of the direction of polarization measurement for each detection event is arbitrary. We use three different procedures to identify pairs of photons and compute the frequency of coincidences by analyzing experimental data and simulation data. The model strictly satisfies Einstein's criteria of local causality, does not rely on any concept of quantum theory and reproduces the results of quantum theory for both types of experiments. We give a rigorous proof that the probabilistic description of the simulation model yields the quantum theoretical expressions for the single- and two-particle expectation values.
 
  • #75
A. Fragner et al. "Resolving Vacuum Fluctuations in an Electrical Circuit by Measuring the Lamb Shift", Science v.322, p.1357 (2008).

Abstract: Quantum theory predicts that empty space is not truly empty. Even in the absence of any particles or radiation, in pure vacuum, virtual particles are constantly created and annihilated. In an electromagnetic field, the presence of virtual photons manifests itself as a small renormalization of the energy of a quantum system, known as the Lamb shift. We present an experimental observation of the Lamb shift in a solid-state system. The strong dispersive coupling of a superconducting electronic circuit acting as a quantum bit (qubit) to the vacuum field in a transmission-line resonator leads to measurable Lamb shifts of up to 1.4% of the qubit transition frequency. The qubit is also observed to couple more strongly to the vacuum field than to a single photon inside the cavity, an effect that is explained by taking into account the limited anharmonicity of the higher excited qubit states.

An amazing feat to detect a Lamb shift in a many-body system such as a superconducting electronic circuit.

Zz.
 
  • #76
J. S. Lundeen and A. M. Steinberg, "Experimental Joint Weak Measurement on a Photon Pair as a Probe of Hardy's Paradox", Phys. Rev. Lett. 102, 020404 (2009).

Abstract: It has been proposed that the ability to perform joint weak measurements on postselected systems would allow us to study quantum paradoxes. These measurements can investigate the history of those particles that contribute to the paradoxical outcome. Here we experimentally perform weak measurements of joint (i.e., nonlocal) observables. In an implementation of Hardy's paradox, we weakly measure the locations of two photons, the subject of the conflicting statements behind the paradox. Remarkably, the resulting weak probabilities verify all of these statements but, at the same time, resolve the paradox.

This experiment appears to be a confirmation and resolution of Hardy's paradox.

A news article on this work can be found at http://exchangemagazine.com/morningpost/2009/week3/Friday/0116014.htm.

Zz.
 
  • #77
A. Cabello et al., "Proposed Bell Experiment with Genuine Energy-Time Entanglement", Phys. Rev. Lett. v.102, p.040401 (2009).

Abstract: Franson's Bell experiment with energy-time entanglement [Phys. Rev. Lett. 62, 2205 (1989)] does not rule out all local hidden variable models. This defect can be exploited to compromise the security of Bell inequality-based quantum cryptography. We introduce a novel Bell experiment using genuine energy-time entanglement, based on a novel interferometer, which rules out all local hidden variable models. The scheme is feasible with actual technology.

Zz.
 
  • #78
This is a recent paper addressing the origin of spin-glass behavior in hole-doped cuprate superconductors. The author proposes a new mechanism for the spin glass that can coexist with Zhang-Rice singlet states. The paper is located at

J. Phys.: Condens. Matter 21 (2009) 075702

Abstract: To address the incompatibility of Zhang–Rice singlet formation and the observed spin glass behavior, an effective model is proposed for the electronic behavior of cuprate materials. The model includes an antiferromagnetic interaction between the spin of the hole in a Zhang–Rice orbital and the spin of the hole on the corresponding copper site. While in the large interaction limit this recovers the t–J model, in the low energy limit the Zhang–Rice singlets are deformed. It is also shown that such deformation can induce random defect ferromagnetic (FM) bonds between adjacent local spins, an effect herein referred to as unusual double exchange, and then spin glass behavior shall result in the case of localized holes. A derivation of the model is also presented.
 
  • #79
V. Moshchalkov et al., "Type-1.5 Superconductivity" Phys. Rev. Lett. 102, 117001 (2009)

Abstract: We demonstrate the existence of a novel superconducting state in high quality two-component MgB2 single crystalline superconductors where a unique combination of both type-1 (λ1/ξ1 < 1/√2) and type-2 (λ2/ξ2 > 1/√2) superconductor conditions is realized for the two components of the order parameter. This condition leads to a vortex-vortex interaction attractive at long distances and repulsive at short distances, which stabilizes unconventional stripe- and gossamerlike vortex patterns that we have visualized in this type-1.5 superconductor using Bitter decoration and also reproduced in numerical simulations.

If this is true, they have found a new phase of superconductivity where both Type I and Type II properties reside in the same material (but in different bands).

You may also read a commentary on this work at http://physics.aps.org/articles/v2/22 AND get a free copy of the exact paper there.

Zz.
 
  • #80
D. Gross, S.T. Flammia and J. Eisert, "Most Quantum States Are Too Entangled To Be Useful As Computational Resources" Phys. Rev. Lett. 102, 190501 (2009)

Abstract: It is often argued that entanglement is at the root of the speedup for quantum compared to classical computation, and that one needs a sufficient amount of entanglement for this speedup to be manifest. In measurement-based quantum computing, the need for a highly entangled initial state is particularly obvious. Defying this intuition, we show that quantum states can be too entangled to be useful for the purpose of computation, in that high values of the geometric measure of entanglement preclude states from offering a universal quantum computational speedup. We prove that this phenomenon occurs for a dramatic majority of all states: the fraction of useful n-qubit pure states is less than exp(-n^2). This work highlights a new aspect of the role entanglement plays for quantum computational speedups.

Quantum computers are still far from realization. Fast decoherence and the difficulty of producing high degrees of entanglement between large numbers of qubits are usually mentioned as the first big problems that come to mind. Now Gross et al. show that most highly entangled quantum states will not provide a significant increase in computational speed compared to classical computers. So it might be necessary in the future to identify and understand the few remaining entangled states that are indeed useful for computation.

There is also an accompanying viewpoint to this article: http://link.aip.org/link/?&l_creator=getabs-normal&l_dir=REV&l_rel=VIEWPOINT&from_key=PRLTAO000102000019190501000001&from_keyType=CVIPS&from_loc=AIP&to_j=PHYSGM&to_v=2&to_p=38&to_loc=APS&to_url=http%3A%2F%2Flink.aps.org%2Fdoi%2F10.1103%2FPhysics.2.38
 
  • #81
M. Gu et al., "More really is different", Physica D: Nonlinear Phenomena, v.238, p.835 (2009); preprint at http://arxiv.org/abs/0809.0151.

Abstract: In 1972, P.W. Anderson suggested that ‘More is Different’, meaning that complex physical systems may exhibit behavior that cannot be understood only in terms of the laws governing their microscopic constituents. We strengthen this claim by proving that many macroscopic observable properties of a simple class of physical systems (the infinite periodic Ising lattice) cannot in general be derived from a microscopic description. This provides evidence that emergent behavior occurs in such systems, and indicates that even if a ‘theory of everything’ governing all microscopic interactions were discovered, the understanding of macroscopic order is likely to require additional insights.

Read the News and Views article on this paper in Nature 459, 332-334 (21 May 2009).

Edit: read the preprint here: http://arxiv.org/abs/0809.0151.

Zz.
 
  • #82
A. V. Ponomarev et al., "ac-Driven Atomic Quantum Motor", Phys. Rev. Lett. v.102, p.230601 (2009).

Abstract: We propose an ac-driven quantum motor consisting of two different, interacting ultracold atoms placed into a ring-shaped optical lattice and submerged in a pulsating magnetic field. While the first atom carries a current, the second one serves as a quantum starter. For fixed zero-momentum initial conditions the asymptotic carrier velocity converges to a unique nonzero value. We also demonstrate that this quantum motor performs work against a constant load.

A review of this paper can also be found at ScienceNOW: http://sciencenow.sciencemag.org/cgi/content/full/2009/609/1.

Zz.
 
  • #83
R. Horodecki et al., "Quantum Entanglement", Rev. Mod. Phys. v.81, p.865 (2009).

Abstract: From the point of view of quantum information science, entanglement is a resource that can be used to perform tasks that are impossible in a classical world. In a certain sense, the more entanglement we have, the better we can perform those tasks. Thus, one of the main goals in this field has been to identify under which conditions two or more systems are entangled, and how entangled they are. This paper reviews the main criteria to detect entanglement as well as entanglement measures and also discusses the role of entanglement in quantum communication and cryptography.

This is a HUGE, 78-page review of quantum entanglement. We get a lot of questions on this topic, so it is appropriate to post a source that has a wealth of information and references.

The arXiv version of this paper is available at http://arxiv.org/abs/quant-ph/0702225.

Zz.
 
  • #84
M. Karski et al., "Quantum Walk in Position Space with Single Optically Trapped Atoms", Science v.325, p. 174 (2009).

Abstract: The quantum walk is the quantum analog of the well-known random walk, which forms the basis for models and applications in many realms of science. Its properties are markedly different from the classical counterpart and might lead to extensive applications in quantum information science. In our experiment, we implemented a quantum walk on the line with single neutral atoms by deterministically delocalizing them over the sites of a one-dimensional spin-dependent optical lattice. With the use of site-resolved fluorescence imaging, the final wave function is characterized by local quantum state tomography, and its spatial coherence is demonstrated. Our system allows the observation of the quantum-to-classical transition and paves the way for applications, such as quantum cellular automata.

Read the ScienceNOW report on this work at http://sciencenow.sciencemag.org/cgi/content/full/2009/710/2.
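For readers unfamiliar with the quantum walk, here is a minimal sketch of my own (an idealized Hadamard walk on a line, not the paper's spin-dependent optical-lattice implementation) showing the ballistic spreading that distinguishes it from the classical random walk:

```python
import numpy as np

# Minimal sketch (mine): an ideal discrete-time Hadamard walk on a line,
# compared with a classical random walk. The quantum walk spreads
# ballistically (sigma ~ t), the classical one diffusively (sigma ~ sqrt(t)).
steps = 100
size = 2 * steps + 1                         # positions -steps ... +steps
psi = np.zeros((size, 2), dtype=complex)     # amplitude[position, coin]
psi[steps, 0] = 1.0                          # start at the origin, coin "up"

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard coin

for _ in range(steps):
    psi = psi @ H.T                          # toss the quantum coin at every site
    shifted = np.zeros_like(psi)
    shifted[1:, 0] = psi[:-1, 0]             # coin "up"   -> step right
    shifted[:-1, 1] = psi[1:, 1]             # coin "down" -> step left
    psi = shifted

prob = np.sum(np.abs(psi) ** 2, axis=1)
x = np.arange(size) - steps
sigma_quantum = np.sqrt(np.sum(prob * x**2) - np.sum(prob * x) ** 2)
sigma_classical = np.sqrt(steps)             # unbiased classical walk, unit steps

print(sigma_quantum, sigma_classical)        # a few tens vs. 10 after 100 steps
```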

Zz.
 
  • #85
M. Aßmann et al., "Higher-Order Photon Bunching in a Semiconductor Microcavity", Science v.325, p.297 (2009).

Abstract: Quantum mechanically indistinguishable particles such as photons may show collective behavior. Therefore, an appropriate description of a light field must consider the properties of an assembly of photons instead of independent particles. We have studied multiphoton correlations up to fourth order in the single-mode emission of a semiconductor microcavity in the weak and strong coupling regimes. The counting statistics of single photons were recorded with picosecond time resolution, allowing quantitative measurement of the few-photon bunching inside light pulses. Our results show bunching behavior in the strong coupling case, which vanishes in the weak coupling regime as the cavity starts lasing. In particular, we verify the n factorial prediction for the zero-delay correlation function of n thermal light photons.

The bunching and anti-bunching phenomena are considered to be among THE strongest evidence for photons; anti-bunching in particular has no classical equivalent.
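The "n factorial" statement is easy to check numerically. A quick sketch of my own, assuming ideal single-mode thermal light with Bose-Einstein photon statistics (an idealization of the actual experiment):

```python
import numpy as np

# Quick numerical check (mine; assumes ideal single-mode thermal light with
# Bose-Einstein photon statistics): the normalised factorial moments
# g^(k)(0) = <n(n-1)...(n-k+1)> / <n>^k equal k! for thermal light.
rng = np.random.default_rng(0)
nbar = 3.0

# Bose-Einstein photon-number distribution = geometric distribution on 0, 1, 2, ...
n = rng.geometric(p=1.0 / (1.0 + nbar), size=2_000_000) - 1

for k in range(2, 5):
    factorial_moment = np.prod([n - j for j in range(k)], axis=0).mean()
    g_k = factorial_moment / n.mean() ** k
    print(k, g_k)    # close to 2, 6, 24, i.e. k!
```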

Zz.
 
  • #86
G. Kirchmair et al., "State-independent experimental test of quantum contextuality", Nature v.460, p.494 (2009).

Abstract: The question of whether quantum phenomena can be explained by classical models with hidden variables is the subject of a long-lasting debate. In 1964, Bell showed that certain types of classical models cannot explain the quantum mechanical predictions for specific states of distant particles, and some types of hidden variable models have been experimentally ruled out. An intuitive feature of classical models is non-contextuality: the property that any measurement has a value independent of other compatible measurements being carried out at the same time. However, a theorem derived by Kochen, Specker and Bell shows that non-contextuality is in conflict with quantum mechanics. The conflict resides in the structure of the theory and is independent of the properties of special states. It has been debated whether the Kochen–Specker theorem could be experimentally tested at all. First tests of quantum contextuality have been proposed only recently, and undertaken with photons and neutrons. But these tests required the generation of special quantum states and left various loopholes open. Here we perform an experiment with trapped ions that demonstrates a state-independent conflict with non-contextuality. The experiment is not subject to the detection loophole and we show that, despite imperfections and possible measurement disturbances, our results cannot be explained in non-contextual terms.

Zz.
 
  • #87
Y. Jompol et al., "Probing Spin-Charge Separation in a Tomonaga-Luttinger Liquid", Science v.325, p.597 (2009).

Abstract: In a one-dimensional (1D) system of interacting electrons, excitations of spin and charge travel at different speeds, according to the theory of a Tomonaga-Luttinger liquid (TLL) at low energies. However, the clear observation of this spin-charge separation is an ongoing challenge experimentally. We have fabricated an electrostatically gated 1D system in which we observe spin-charge separation and also the predicted power-law suppression of tunneling into the 1D system. The spin-charge separation persists even beyond the low-energy regime where the TLL approximation should hold. TLL effects should therefore also be important in similar, but shorter, electrostatically gated wires, where interaction effects are being studied extensively worldwide.

Just imagine: a charge carrier (say an electron) somehow behaves as if its spin and its charge have been fractionalized and thus move differently. This is what spin-charge separation is. It is one of those fundamental phenomena in condensed matter physics that isn't observed anywhere else, but it could potentially be a fundamental principle in the physics of elementary particles.

Previous experiments have shown signatures of such spin-charge separation. It has been shown that the charge and thermal currents in 1D organic conductors violate the Wiedemann-Franz law, an indication of a possible spin-charge separation. The charge current had a different dispersion than the thermal currents, something you don't find in a standard Solid State Physics text.

In this new work, a different type of experiment was done: tunneling into a 1D system. There appear to be clear signatures of spin-charge separation in the tunneling currents that were observed.

Zz.
 
  • #88
S. S. Hodgman et al., "Metastable Helium: A New Determination of the Longest Atomic Excited-State Lifetime", Phys. Rev. Lett. v.103, p.053002 (2009).

Abstract: Excited atoms may relax to the ground state by radiative decay, a process which is usually very fast (of order nanoseconds). However, quantum-mechanical selection rules can prevent such rapid decay, in which case these "metastable" states can have lifetimes of order seconds or longer. In this Letter, we determine experimentally the lifetime of the longest-lived neutral atomic state—the first excited state of helium (the 2³S₁ metastable state)—to the highest accuracy yet measured. We use laser cooling and magnetic trapping to isolate a cloud of metastable helium (He*) atoms from their surrounding environment, and measure the decay rate to the ground 1¹S₀ state via extreme ultraviolet (XUV) photon emission. This is the first measurement using a virtually unperturbed ensemble of isolated helium atoms, and yields a value of 7870(510) seconds, in excellent agreement with the predictions of quantum electrodynamic theory.

Whoa! That's more than 2 hours!

Zz.
 
  • #89
Z. Bern et al., "Ultraviolet Behavior of N=8 Supergravity at Four Loops", Phys. Rev. Lett. 103, 081301 (2009).

Abstract: We describe the construction of the complete four-loop four-particle amplitude of N=8 supergravity. The amplitude is ultraviolet finite, not only in four dimensions, but in five dimensions as well. The observed extra cancellations provide additional nontrivial evidence that N=8 supergravity in four dimensions may be ultraviolet finite to all orders of perturbation theory.

Read a review of this work AND get a free copy of the paper at http://physics.aps.org/articles/v2/70.

Zz.
 
  • #90
L. Maccone "Quantum Solution to the Arrow-of-Time Dilemma", Phys. Rev. Lett. 103, 080401 (2009).

Abstract: The arrow-of-time dilemma states that the laws of physics are invariant for time inversion, whereas the familiar phenomena we see everyday are not (i.e., entropy increases). I show that, within a quantum mechanical framework, all phenomena which leave a trail of information behind (and hence can be studied by physics) are those where entropy necessarily increases or remains constant. All phenomena where the entropy decreases must not leave any information of their having happened. This situation is completely indistinguishable from their not having happened at all. In the light of this observation, the second law of thermodynamics is reduced to a mere tautology: physics cannot study those processes where entropy has decreased, even if they were commonplace.

Read the Focus article on this paper here:

http://focus.aps.org/story/v24/st7

Zz.
 
  • #91
laurencn106 said:
D. J. Kapner et al., "Tests of the Gravitational Inverse-Square Law below the Dark-Energy Length Scale", Phys. Rev. Lett. 98, 021101 (2007)

Abstract: We conducted three torsion-balance experiments to test the gravitational inverse-square law at separations between 9.53 mm and 55 µm, probing distances less than the dark-energy length scale λ_d = (ħc/ρ_d)^(1/4) ≈ 85 µm. We find with 95% confidence that the inverse-square law holds (|α| ≤ 1) down to a length scale λ = 56 µm and that an extra dimension must have a size R ≤ 44 µm.

Thanks for your contribution, but this paper was highlighted already 2 years ago here:

https://www.physicsforums.com/showpost.php?p=1301594&postcount=39

As per the "theme" of this thread, we try to highlight papers within the past year. If you are unsure if a paper has been highlighted here already, do a search on the thread on the first author's name.

Zz.
 
  • #92
S. Rao et al. "Measurement of Mechanical Forces Acting on Optically Trapped Dielectric Spheres Induced by Surface-Enhanced Raman Scattering", Phys. Rev. Lett. v.102, p.087401 (2009).

Abstract: Surface enhanced Raman scattering (SERS) is studied from optically trapped dielectric spheres partially covered with silver colloids in a solution with SERS active molecules. The Raman scattering and Brownian motion of the sphere are simultaneously measured to reveal correlations between the enhancement of the Raman signal and average position of the sphere. The correlations are due to the momenta transfer of the emitted Raman photons from the probe molecules. The addition of a mechanical force measurement provides a different dimension to the study of Raman processes.

You may also read the Physical Review Focus story at http://focus.aps.org/story/v24/st12.

Zz.
 
  • #93
A. J. Bennett et al., "Interference of dissimilar photon sources", Nature Physics v.5, p.715-717 (2009).

Abstract: If identical photons meet at a semi-transparent mirror they seem to leave in the same direction, an effect called 'two-photon interference'. It has been known for some time that this effect should occur for photons generated by dissimilar sources with no common history, provided the measurement cannot distinguish between the photons. Here, we report a technique for observing such interference with isolated, unsynchronized sources for which the coherence times differ by several orders of magnitude. In an experiment we cause photons generated by different physical processes, with different photon statistics, to interfere. One of the sources is stimulated emission from a tunable laser, which has Poissonian statistics and a nanoelectronvolt bandwidth. The other is spontaneous emission from a quantum dot in a p–i–n diode with a few-microelectronvolt linewidth. We develop a theory to explain the visibility of interference, which is primarily limited by the timing resolution of our detectors.

It is well known that there is a close connection between indistinguishability and interference. Recently there have therefore been many efforts to test experimentally to what extent photons from distinguishable sources can be made indistinguishable. This has been shown in several systems before, including single atoms, ions, consecutive single photons from single quantum dots and even different semiconductor nanostructures. Bennett et al. now show that even photons from completely different kinds of light sources can exhibit two-photon interference.
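The basic two-photon interference effect is easy to state quantitatively. In a minimal sketch of my own (an idealized lossless 50/50 beamsplitter, not the paper's detailed theory), the coincidence probability is (1 - v^2)/2, where v is the mode overlap of the two photons, so perfect indistinguishability suppresses coincidences entirely:

```python
import numpy as np

# Minimal sketch (mine, not the paper's detailed theory): Hong-Ou-Mandel
# interference of one photon in each input port of a 50/50 beamsplitter.
# The two ways to get a coincidence (both transmitted, both reflected)
# interfere destructively for indistinguishable photons; for distinguishable
# photons the probabilities add instead.
def coincidence_probability(v):
    """v = mode overlap of the two photons (1 = identical, 0 = distinguishable)."""
    t = 1.0 / np.sqrt(2.0)     # transmission amplitude
    r = 1.0j / np.sqrt(2.0)    # reflection amplitude of a lossless 50/50 beamsplitter
    p_interfering = abs(t * t + r * r) ** 2                 # = 0: amplitudes cancel
    p_distinguishable = abs(t * t) ** 2 + abs(r * r) ** 2   # = 1/2
    return v**2 * p_interfering + (1.0 - v**2) * p_distinguishable

for v in (1.0, 0.7, 0.0):
    print(f"overlap {v:.1f} -> coincidence probability {coincidence_probability(v):.3f}")
```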
 
  • #94
Are there any papers published at the high-school level?
 
  • #95
"Quantum Zeno effect explains magnetic-sensitive radical-ion-pair reactions", Phys. Rev. E 80, 056115 (2009) - http://arxiv.org/abs/0806.0739

Abstract: Chemical reactions involving radical-ion pairs are ubiquitous in biology, since not only are they at the basis of the photosynthetic reaction chain, but are also assumed to underlie the biochemical magnetic compass used by avian species for navigation. Recent experiments with magnetic-sensitive radical-ion-pair reactions provided strong evidence for the radical-ion-pair magnetoreception mechanism, verifying the expected magnetic sensitivities and chemical product yield changes. It is here shown that the theoretical description of radical-ion-pair reactions used since the 70s cannot explain the observed data, because it is based on phenomenological equations masking quantum coherence effects. The fundamental density-matrix equation derived here from basic quantum measurement theory considerations naturally incorporates the quantum Zeno effect and readily explains recent experimental observations on low- and high-magnetic-field radical-ion-pair reactions.
 
  • #96
R. Gerritsma et al., "Quantum simulation of the Dirac equation", Nature v.463, p.68 (2010).

Abstract: The Dirac equation successfully merges quantum mechanics with special relativity. It provides a natural description of the electron spin, predicts the existence of antimatter and is able to reproduce accurately the spectrum of the hydrogen atom. The realm of the Dirac equation—relativistic quantum mechanics—is considered to be the natural transition to quantum field theory. However, the Dirac equation also predicts some peculiar effects, such as Klein’s paradox and ‘Zitterbewegung’, an unexpected quivering motion of a free relativistic quantum particle. These and other predicted phenomena are key fundamental examples for understanding relativistic quantum effects, but are difficult to observe in real particles. In recent years, there has been increased interest in simulations of relativistic quantum effects using different physical set-ups in which parameter tunability allows access to different physical regimes. Here we perform a proof-of-principle quantum simulation of the one-dimensional Dirac equation using a single trapped ion set to behave as a free relativistic quantum particle. We measure the particle position as a function of time and study Zitterbewegung for different initial superpositions of positive- and negative-energy spinor states, as well as the crossover from relativistic to non-relativistic dynamics. The high level of control of trapped-ion experimental parameters makes it possible to simulate textbook examples of relativistic quantum physics.
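The quivering motion itself is easy to reproduce numerically for the free 1+1D Dirac equation: any wave packet containing both positive- and negative-energy components has an oscillating ⟨x⟩(t). Below is a minimal momentum-space sketch in natural units (ħ = c = m = 1); it is a generic illustration of Zitterbewegung, not a model of the trapped-ion experiment:

```python
import numpy as np

# Free 1+1D Dirac equation in momentum space, natural units hbar = c = m = 1.
# H(p) = p*sigma_x + m*sigma_z acts independently on each momentum mode,
# so the evolution is an exact 2x2 matrix exponential per p.

p = np.linspace(-10, 10, 4096)
dp = p[1] - p[0]
m = 1.0

# Gaussian wave packet, equal superposition of the two spinor components.
# This state mixes positive- and negative-energy branches, which is what
# produces the Zitterbewegung oscillation in <x>(t).
envelope = np.exp(-p**2 / 2.0)
psi0 = np.array([envelope, envelope], dtype=complex)
psi0 /= np.sqrt(np.sum(np.abs(psi0) ** 2) * dp)

def evolve(psi, t):
    """exp(-i H(p) t) psi, using exp(-i n.sigma t) = cos(Et) - i sin(Et) n.sigma/E."""
    E = np.sqrt(p**2 + m**2)
    cos_t, sin_t = np.cos(E * t), np.sin(E * t)
    upper = cos_t * psi[0] - 1j * sin_t / E * (m * psi[0] + p * psi[1])
    lower = cos_t * psi[1] - 1j * sin_t / E * (p * psi[0] - m * psi[1])
    return np.array([upper, lower])

def mean_x(psi):
    """<x> = integral of psi^dagger (i d/dp) psi dp, via central differences."""
    dpsi_dp = np.gradient(psi, dp, axis=1)
    return float(np.real(np.sum(np.conj(psi) * 1j * dpsi_dp) * dp))

for t in np.linspace(0.0, 6.0, 13):
    print(f"t = {t:4.1f}   <x>(t) = {mean_x(evolve(psi0, t)):+.4f}")
```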

Zz.
 
  • #97
B. P. Lanyon et al., "Towards quantum chemistry on a quantum computer", Nature Chemistry v.2, p.106 (2010).

Abstract: Exact first-principles calculations of molecular properties are currently intractable because their computational cost grows exponentially with both the number of atoms and basis set size. A solution is to move to a radically different model of computing by building a quantum computer, which is a device that uses quantum systems themselves to store and process data. Here we report the application of the latest photonic quantum computer technology to calculate properties of the smallest molecular system: the hydrogen molecule in a minimal basis. We calculate the complete energy spectrum to 20 bits of precision and discuss how the technique can be expanded to solve large-scale chemical problems that lie beyond the reach of modern supercomputers. These results represent an early practical step toward a powerful tool with a broad range of quantum-chemical applications.
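For a sense of what "20 bits of precision" means operationally: the energy appears as an eigenphase of exp(-iHt), and such a phase can be read out one binary digit at a time by iterative phase estimation. A purely classical, noiseless toy of that bit-by-bit readout (made-up phase, and certainly not the paper's photonic circuit):

```python
import math

def iterative_phase_estimation(phi, n_bits=20):
    """Classical, noiseless toy: recover an n_bits binary expansion of a
    phase phi in [0, 1), least-significant bit first, as in iterative
    phase estimation. The 'measurement' here is the deterministic
    most-likely outcome."""
    bits = [0] * n_bits
    feedback = 0.0                      # fraction 0.0 b_{k+1} b_{k+2} ... already measured
    for k in range(n_bits, 0, -1):
        # Phase (in units of 2*pi) on the ancilla after controlled-U^(2^(k-1)),
        # with a feedback rotation removing the previously measured bits.
        theta = ((2 ** (k - 1)) * phi - feedback) % 1.0
        p_one = math.sin(math.pi * theta) ** 2   # probability of measuring |1>
        bits[k - 1] = 1 if p_one > 0.5 else 0
        feedback = feedback / 2.0 + bits[k - 1] / 4.0
    estimate = sum(b * 2.0 ** -(i + 1) for i, b in enumerate(bits))
    return bits, estimate

phi_true = 0.30121      # hypothetical eigenphase standing in for an energy
bits, phi_est = iterative_phase_estimation(phi_true)
print("bits (MSB first):", "".join(map(str, bits)))
print(f"estimate {phi_est:.7f} vs true {phi_true:.7f}")
```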

You can read a report on this work at Wired: http://www.wired.com/wiredscience/2010/01/quantum-computer-hydrogen-simulation/

Zz.
 
  • #98
D. W. Berry et al., "Fair-sampling assumption is not necessary for testing local realism", Phys. Rev. A v.81, p.012109 (2010).

Abstract: Almost all Bell inequality experiments to date have used postselection and therefore relied on the fair sampling assumption for their interpretation. The standard form of the fair sampling assumption is that the loss is independent of the measurement settings, so the ensemble of detected systems provides a fair statistical sample of the total ensemble. This is often assumed to be needed to interpret Bell inequality experiments as ruling out hidden-variable theories. Here we show that it is not necessary; the loss can depend on measurement settings, provided the detection efficiency factorizes as a function of the measurement settings and any hidden variable. This condition implies that Tsirelson’s bound must be satisfied for entangled states. On the other hand, we show that it is possible for Tsirelson’s bound to be violated while the Clauser-Horne-Shimony-Holt (CHSH)-Bell inequality still holds for unentangled states, and present an experimentally feasible example.

Although I do not care much about interpretational issues and all that nonlocality vs. local realism stuff, a lot of people around here do, so they may be interested in this formal treatment of the meaning of fair sampling.
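For readers who want to see the Tsirelson bound mentioned in the abstract in numbers: the singlet-state correlation is E(a, b) = −cos(a − b), and the standard CHSH angle choice saturates |S| = 2√2 ≈ 2.83, compared with the local-realist limit of 2. A minimal sketch of that textbook arithmetic (not the paper's factorization argument):

```python
import math

def E(a, b):
    """Singlet-state correlation for analyser angles a and b (radians)."""
    return -math.cos(a - b)

# Standard CHSH settings that saturate Tsirelson's bound, |S| = 2*sqrt(2).
a, a_prime = 0.0, math.pi / 2
b, b_prime = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)
print(f"|S| = {abs(S):.4f}   "
      f"(local realism: |S| <= 2, Tsirelson: |S| <= {2 * math.sqrt(2):.4f})")
```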
 
  • #99
Holger Müller, Achim Peters & Steven Chu, "A precision measurement of the gravitational redshift by the interference of matter waves", Nature v.463, p.926 (2010).

Abstract: One of the central predictions of metric theories of gravity, such as general relativity, is that a clock in a gravitational potential U will run more slowly by a factor of 1 + U/c^2, where c is the velocity of light, as compared to a similar clock outside the potential. This effect, known as gravitational redshift, is important to the operation of the global positioning system, timekeeping and future experiments with ultra-precise, space-based clocks (such as searches for variations in fundamental constants). The gravitational redshift has been measured using clocks on a tower, an aircraft and a rocket, currently reaching an accuracy of 7 × 10^-5. Here we show that laboratory experiments based on quantum interference of atoms enable a much more precise measurement, yielding an accuracy of 7 × 10^-9. Our result supports the view that gravity is a manifestation of space-time curvature, an underlying principle of general relativity that has come under scrutiny in connection with the search for a theory of quantum gravity. Improving the redshift measurement is particularly important because this test has been the least accurate among the experiments that are required to support curved space-time theories.
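To put the quoted accuracies in perspective, the fractional frequency shift between two clocks separated by a height h near the Earth's surface is roughly gh/c², about 1.1 × 10⁻¹⁶ per metre. A minimal sketch of that one piece of arithmetic (standard constants; not a calculation from the paper):

```python
g = 9.81            # m/s^2, local gravitational acceleration (approximate)
c = 2.99792458e8    # m/s
h = 1.0             # metres of height difference between the two "clocks"

# Gravitational redshift between two clocks separated by height h:
# delta_nu / nu ~= g*h / c^2
fractional_shift = g * h / c**2
print(f"fractional frequency shift over {h} m: {fractional_shift:.2e}")
# -> about 1.1e-16, which illustrates why testing the redshift at the 7e-9
#    level quoted in the abstract requires an extraordinarily sensitive method.
```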

You may read a report on this work at the Physics World website: http://physicsworld.com/cws/article/news/41740

Also, note the name of one of the authors of this paper: Steven Chu, who is currently the Secretary of the US Dept. of Energy! :)

Zz.
 
  • #100
H. Shishido et al., "Tuning the Dimensionality of the Heavy Fermion Compound CeIn3", Science v.327, p.980 (2010).

Abstract: Condensed-matter systems that are both low-dimensional and strongly interacting often exhibit unusual electronic properties. Strongly correlated electrons with greatly enhanced effective mass are present in heavy fermion compounds, whose electronic structure is essentially three-dimensional. We realized experimentally a two-dimensional heavy fermion system, adjusting the dimensionality in a controllable fashion. Artificial superlattices of the antiferromagnetic heavy fermion compound CeIn3 and the conventional metal LaIn3 were grown epitaxially. By reducing the thickness of the CeIn3 layers, the magnetic order was suppressed and the effective electron mass was further enhanced. Heavy fermions confined to two dimensions display striking deviations from the standard Fermi liquid low-temperature electronic properties, and these are associated with the dimensional tuning of quantum criticality.

Also see the Perspective article by Piers Coleman in the same issue.

This is a very interesting piece of work: the "parameter" controlling the quantum phase transition is now the dimensionality itself, tuned from 3D to 2D.

Zz.
 
