Monday, November 22, 2010

NASA's Newest Microsatellite FASTSAT Launches Successfully

FASTSAT launches from Kodiak, Alaska Nov. 19, 2010.
Credit: Steven Young/Spaceflight Now
NASA's Fast, Affordable, Science and Technology Satellite, or FASTSAT, launched at 7:25 p.m. CST Friday aboard a Minotaur IV rocket from Kodiak Launch Complex on Kodiak Island, Alaska. FASTSAT is a unique platform that can carry multiple small payloads to low-Earth orbit, creating opportunities for researchers to conduct low-cost scientific and technology research on an autonomous satellite in space.

FASTSAT separated from the Minotaur IV rocket approximately 22 minutes after launch, entered low-Earth orbit 406 miles above Earth and immediately began powering up. NASA ground stations are tracking the now-activated spacecraft. The small satellite command center, located at the Huntsville Operations and Science Control Center at NASA's Marshall Space Flight Center in Huntsville, Ala., is continuing commissioning operations of the satellite.

"This milestone is a testament to our FASTSAT team that worked tirelessly to design, build and test a fully functional, low-cost satellite in a year," said Mark Boudreaux, FASTSAT project manager at the Marshall Center. Boudreaux said the team maximized the number of payloads onboard and designed the satellite to support the Evolved Expendable Launch Vehicle Secondary Payload Adaptor (ESPA) ring to increase opportunities for ride sharing with Department of Defense ESPA configurable launch vehicles.

For the first 11 days after launch, the spacecraft and six onboard atmospheric and technology demonstration experiments will go through an on-orbit commissioning phase. Once commissioning is complete, the next 180 days will be focused on science operations. A checkout and performance analysis of each science instrument will be performed and then, one by one, each experiment will turn on to perform its science objectives.

After the science phase is complete, the spacecraft will undergo additional characterization to test further flight objectives. These operations will run in parallel to exercise the spacecraft's overall capabilities and will last approximately 100 days. After this phase of the mission is complete, the command will be sent to shut down the spacecraft, which will then enter a decommissioning phase.
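The phased timeline described above can be sketched as a back-of-the-envelope calculation. The dates are approximate and assume the phases run back to back, which the release does not explicitly state:

```python
from datetime import date, timedelta

# Mission phases as described in the release (durations in days).
# The launch date is Nov. 19, 2010; the characterization phase length
# is quoted only as "approximately 100 days".
launch = date(2010, 11, 19)
phases = [
    ("commissioning", 11),
    ("science operations", 180),
    ("spacecraft characterization", 100),
]

start = launch
for name, days in phases:
    end = start + timedelta(days=days)
    print(f"{name}: {start} -> {end}")
    start = end
```

Under these assumptions the mission would reach decommissioning in early September 2011.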

The microsatellite FASTSAT, weighing just under 400 pounds, will serve as a scientific laboratory containing all the resources needed to carry out scientific and technology research operations for the duration of the mission. FASTSAT was designed around simple spacecraft subsystems that provide power management, onboard storage of experiment data, control of the experiments, communications with ground stations, propellantless mechanisms for attitude control and a GPS system for navigation.

FASTSAT launched on the STP-S26 mission -- a joint activity between NASA and the U.S. Department of Defense Space Test Program. The satellite was designed, developed and tested at the Marshall Center in partnership with the Von Braun Center for Science & Innovation and Dynetics Inc. of Huntsville. Dynetics provided key engineering, manufacturing and ground operations support for the new microsatellite. Thirteen Huntsville-area firms, as well as the University of Alabama in Huntsville, also were part of the project team.
  
Source: Reprinted news release via NASA

Saturday, November 20, 2010

Futuristic Taiwan Tower To Have Floating Observatories

Credit: DSBA
(PhysOrg.com) -- A futuristic tower called "Floating Observatories," which resembles a tree trunk with eight floating elevator observatories shaped like leaves, will soon become a major landmark in Taichung, Taiwan's third largest city.

The conceptual design of the tower was made by a team from the companies Dorin Stefan Birou Arhitectura (DSBA), Upgrade.Studio, and Mihai Cracium, and led by DSBA principal architect Stefan Dorin from Romania. The tower design won first prize in the recent Taiwan Tower Conceptual International Competition. Dorin explained the design represented a "technological tree," with elevator observatories shaped like the island of Taiwan, which is leaf-shaped.

The tower, standing over 300 meters high, will include an information center, museum, office tower, conference venue, fixed and floating observation decks, restaurants, and an urban park. Read more at Physorg

Spitzer Reveals a Buried Explosion Sparked by a Galactic Train Wreck


Astronomers using NASA's Spitzer Space Telescope have found a stunning burst of star formation that beams out as much infrared light as an entire galaxy. The collision of two spiral galaxies has triggered this explosion, which is cloaked by dust that renders its stars nearly invisible in other wavelengths of light.

The starburst newly revealed by Spitzer stands as the most luminous ever seen taking place away from the centers, or nuclei, of merging parent galaxies. It blazes ten times brighter than the nearby Universe's previous best-known "off-nuclear starburst," which gleams in another galactic smashup known as the Antennae Galaxies.

The new findings show that galaxy mergers can pack a real star-making wallop far from the respective galactic centers, where star-forming dust and gases typically pool.

"This discovery proves that merging galaxies can generate powerful starbursts outside of the centers of the parent galaxies," says Hanae Inami, first author of a paper detailing the results in the July issue of The Astronomical Journal. Inami is a graduate student at The Graduate University for Advanced Studies in Japan and the Spitzer Science Center at the California Institute of Technology. She adds: "The infrared light emission of the starburst dominates its host galaxy and rivals that of the most luminous galaxies we see that are relatively close to our home, the Milky Way."

"No matter how you slice it, this starburst is one of the most luminous objects in the local Universe," agrees Lee Armus, second author of the paper and a senior research astronomer also at the Spitzer Science Center.

A dazzling galactic dust-up
Inami, Armus and their colleagues spotted the buried starburst with Spitzer in the interacting galaxies known as II Zw 096. This galactic train wreck - located around 500 million light years away in the constellation Delphinus (the Dolphin) - will continue to unfold for a few hundred million years. Gravitational forces have already dissolved the once-pinwheel shape of one of II Zw 096's pair of merging galaxies.

The ultra-bright starburst region spans 700 light-years or so - just a tiny portion of II Zw 096, which streams across some 50,000 to 60,000 light-years - yet it blasts out 80 percent of the infrared light from this galactic tumult. Based on Spitzer data, researchers estimate the starburst is cranking out stars at the breakneck pace of around 100 solar masses, or masses of our Sun, per year.
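The scale mismatch quoted above is striking, and a quick calculation makes it concrete (55,000 light-years is taken here as the midpoint of the quoted 50,000 to 60,000 range):

```python
# Rough scale comparison from the release: the starburst spans ~700
# light-years of a system ~55,000 light-years across, yet produces
# 80 percent of the system's infrared output.
burst_extent_ly = 700
system_extent_ly = 55_000   # midpoint of the quoted 50,000-60,000
ir_fraction = 0.80

size_fraction = burst_extent_ly / system_extent_ly
print(f"linear extent: {size_fraction:.1%} of the system")   # ~1.3%
print(f"infrared output: {ir_fraction:.0%} of the system")
```

So a region covering little more than one percent of the system's extent dominates its infrared light.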

The prodigious infrared energy output of this starburst, far from the galactic nuclei, surprised the Spitzer researchers. The new observations show how much the apparent nature of a cosmic object can change when it is viewed at different wavelengths of light. The shapes and dynamics of distant, harder-to-study galactic mergers could therefore turn out to be a good deal more complex than current observations over a narrow range of wavelengths imply.

"Most of the far-infrared emission in II Zw 096, and hence most of the power, is coming from a region that is not associated with the centers of the merging galaxies," Inami explains. "This suggests that the appearances and interactions of distant, early galaxies during epochs when mergers were much more common than today in the Universe might be more complicated than we think."

A fleeting, perhaps prophetic vista?
In galaxy mergers, individual stars rarely slam into one another because of the vast distances separating them; even in the comparatively crowded central hubs of spiral galaxies, trillions of kilometers still often yawn between the stars.

But giant, diffuse clouds of gas and dust in galaxies do crash together - passing through each other somewhat like ocean waves - and in turn spur the gravitational collapse of dense pockets of matter into new stars. These young, hot stars shine intensely in the energetic ultraviolet part of the spectrum. In the case of II Zw 096, however, a thick shroud of gas and dust still surrounds this stellar brood. The blanket of material absorbs the stars' light and re-radiates it in the lower-energy, infrared wavelengths that gleam clear through the dust to Spitzer's camera.
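The shift described above, from starlight absorbed by dust to thermal re-emission at longer wavelengths, follows from blackbody physics. Here is a toy illustration using Wien's displacement law; the 50 K dust temperature is an assumed, typical value for starburst dust and does not come from the release:

```python
# Wien's displacement law: lambda_peak = b / T. Warm dust absorbing
# ultraviolet starlight re-emits thermally; for an assumed dust
# temperature of ~50 K, that emission peaks in the far infrared.
WIEN_B = 2.898e-3  # Wien displacement constant, m*K

def peak_wavelength_um(temperature_k: float) -> float:
    """Blackbody peak emission wavelength in micrometres."""
    return WIEN_B / temperature_k * 1e6

print(peak_wavelength_um(50.0))     # ~58 um: far infrared, from cool dust
print(peak_wavelength_um(30_000.0)) # ~0.1 um: ultraviolet, from a hot young star
```

The same energy that left the young stars as ultraviolet light thus reaches Spitzer's camera shifted hundreds of times redward.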

Astronomers were lucky to capture this transient phase in the evolution of the starburst and of the daughter galaxy that will eventually coalesce out of the collision. "Spitzer has allowed us to see the fireworks before all the gas and dust has cleared away, giving us a preview of the exciting new galaxy being built under the blanket," Inami says.

Merging galaxies such as II Zw 096 also offer a sneak peek at the fate of our Milky Way in some 4.5 billion years, when it is expected to plow into its nearest large galactic neighbor, the Andromeda Galaxy. Off-nuclear starbursts such as those in II Zw 096 and the Antennae Galaxies could perhaps occur in the vicinity of our Solar System, which is located about two-thirds of the way out from the Milky Way's glowing, bulging center.

"This kind of dramatic thing happening in II Zw 096 could happen to the Milky Way and Andromeda when they meet in the far future," says Inami.


Source: Reprinted news release via NASA Spitzer

New Microscope Reveals Ultrastructure Of Cells

Slice through the nucleus of a mouse adenocarcinoma cell
showing the nucleolus (NU) and the membrane channels running
across the nucleus (NMC); taken by X-ray nanotomography.
Photo: HZB/Schneider
For the first time, there is no need to chemically fix, stain or cut cells in order to study them. Instead, whole living cells are fast-frozen and studied in their natural environment. The new method delivers an immediate 3-D image, thereby closing a gap between conventional microscopic techniques.

The new microscope delivers a high-resolution 3-D image of the entire cell in one step. This is an advantage over electron microscopy, in which a 3-D image is assembled out of many thin sections, a process that can take weeks for just one cell. Also, the cell need not be labelled with dyes, unlike in fluorescence microscopy, where only the labelled structures become visible. The new X-ray microscope instead exploits the natural contrast between organic material and water to form an image of all cell structures. Dr. Gerd Schneider and his microscopy team at the Institute for Soft Matter and Functional Materials have published their development in Nature Methods (DOI:10.1038/nmeth.1533).

With the high resolution achieved by their microscope, the researchers, in cooperation with colleagues of the National Cancer Institute in the USA, have reconstructed mouse adenocarcinoma cells in three dimensions. The smallest of details were visible: the double membrane of the cell nucleus, nuclear pores in the nuclear envelope, membrane channels in the nucleus, numerous invaginations of the inner mitochondrial membrane and inclusions in cell organelles such as lysosomes. Such insights will be crucial for shedding light on intracellular processes, such as how viruses or nanoparticles penetrate cells or the nucleus.

This is the first time the so-called ultrastructure of cells has been imaged with X-rays to such precision, down to 30 nanometres; ten nanometres is about one ten-thousandth of the width of a human hair. Ultrastructure is the detailed structure of a biological specimen that is too small to be seen with an optical microscope.
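The hair comparison above is easy to verify. Taking a typical hair width of 100 micrometres (an assumed round figure; real hairs vary):

```python
# Scale check for the quoted resolution. A human hair is roughly
# 100 micrometres (1e-4 m) wide; the release states that ten nanometres
# is about one ten-thousandth of that width.
hair_width_m = 100e-6   # assumed typical hair width
resolution_m = 30e-9    # microscope resolution, 30 nm

print(10e-9 / hair_width_m)        # 1e-4, i.e. one ten-thousandth
print(hair_width_m / resolution_m) # ~3,300 resolution elements fit
                                   # across a single hair width
```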

Researchers achieved this high 3-D resolution by illuminating the minute structures of the frozen-hydrated object with partially coherent light. This light is generated by BESSY II, the synchrotron source at HZB. Partial coherence is the property of two waves whose relative phase undergoes random fluctuations which are not, however, sufficient to make the waves completely incoherent. Illumination with partially coherent light generates significantly higher contrast for small object details compared to incoherent illumination. Combining this approach with a high-resolution lens, the researchers were able to visualize the ultrastructures of cells at hitherto unattained contrast.

The new X-ray microscope also allows for more space around the sample, which leads to a better spatial view. This space had always been greatly limited by the setup for the sample illumination: the required monochromatic X-ray light was created using a radial grid, and a diaphragm would then select the desired range of wavelengths from this light. The diaphragm had to be placed so close to the sample that there was almost no space to turn the sample around. The researchers modified this setup: monochromatic light is collected by a new type of condenser which directly illuminates the object, and the diaphragm is no longer needed. This allows the sample to be turned by up to 158 degrees and observed in three dimensions. These developments provide a new tool in structural biology for the better understanding of cell structure.

Source: Reprinted news release via Helmholtz Association of German Research Centres

Hubble Captures New 'Life' in an Ancient Galaxy

Elliptical galaxies were once thought to be aging star cities whose star-making heyday was billions of years ago.

But new observations with NASA's Hubble Space Telescope are helping to show that elliptical galaxies still have some youthful vigor left, thanks to encounters with smaller galaxies.

Images of the core of NGC 4150, taken in near-ultraviolet light with the sharp-eyed Wide Field Camera 3 (WFC3), reveal streamers of dust and gas and clumps of young, blue stars that are significantly less than a billion years old. Evidence shows that the star birth was sparked by a merger with a dwarf galaxy.

The new study helps bolster the emerging view that most elliptical galaxies have young stars, bringing new life to old galaxies.

"Elliptical galaxies were thought to have made all of their stars billions of years ago," says astronomer Mark Crockett of the University of Oxford, leader of the Hubble observations. "They had consumed all their gas to make new stars. Now we are finding evidence of star birth in many elliptical galaxies, fueled mostly by cannibalizing smaller galaxies.

"These observations support the theory that galaxies built themselves up over billions of years by collisions with dwarf galaxies," Crockett continues. "NGC 4150 is a dramatic example in our galactic back yard of a common occurrence in the early universe."

The Hubble images reveal turbulent activity deep inside the galaxy's core. Clusters of young, blue stars trace a ring around the center that is rotating with the galaxy. The stellar breeding ground is about 1,300 light-years across. Long strands of dust are silhouetted against the yellowish core, which is composed of populations of older stars.

From a Hubble analysis of the stars' colors, Crockett and his team calculated that the star-formation boom started about a billion years ago, a comparatively recent event in cosmological history. The galaxy's star-making factory has slowed down since then.

"We are seeing this galaxy after the major starburst has occurred," explains team member Joseph Silk of the University of Oxford. "The most massive stars are already gone. The youngest stars are between 50 million and 300 to 400 million years old. By comparison, most of the stars in the galaxy are around 10 billion years old."

The encounter that triggered the star birth would have been similar to our Milky Way swallowing the nearby Large Magellanic Cloud.

"We believe that a merger with a small, gas-rich galaxy around one billion years ago supplied NGC 4150 with the fuel necessary to form new stars," says team member Sugata Kaviraj of the Imperial College London and the University of Oxford. "The abundance of 'metals' -- elements heavier than hydrogen and helium—in the young stars is very low, suggesting the galaxy that merged with NGC 4150 was also metal-poor. This points towards a small, dwarf galaxy, around one-twentieth the mass of NGC 4150."

Minor mergers such as this one are more common than interactions between hefty galaxies, the astronomers say: for every major encounter, there are probably up to 10 clashes between a large and a small galaxy. Major collisions are easier to see because they create incredible fireworks: distorted galaxies, long streamers of gas, and dozens of young star clusters. Smaller interactions are harder to detect because they leave relatively little trace.

Over the past five years, however, ground- and space-based telescopes have offered hints of fresh star formation in elliptical galaxies. Ground-based observatories captured the blue glow of stars in elliptical galaxies, and satellites such as the Galaxy Evolution Explorer (GALEX), which looks in far- and near-ultraviolet light, confirmed that the blue glow came from fledgling stars much less than a billion years old. Ultraviolet light traces the glow of hot, young stars.

Crockett and his team selected NGC 4150 for their Hubble study because a ground-based spectroscopic analysis gave tantalizing hints that the galaxy's core was not a quiet place. The ground-based survey, called the Spectrographic Areal Unit for Research on Optical Nebulae (SAURON), revealed the presence of young stars and dynamic activity that was out of sync with the galaxy.

"In visible light, elliptical galaxies such as NGC 4150 look like normal elliptical galaxies," Silk says. "But the picture changes when we look in ultraviolet light. At least a third of all elliptical galaxies glow with the blue light of young stars."

Adds Crockett: "Ellipticals are the perfect laboratory for studying minor mergers in ultraviolet light because they are dominated by old red stars, allowing astronomers to see the faint blue glow of young stars."

The astronomers hope to study other elliptical galaxies in the SAURON survey to look for the signposts of new star birth. The team's results have been accepted for publication in The Astrophysical Journal.

The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA's Goddard Space Flight Center manages the telescope. The Space Telescope Science Institute (STScI) conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy, Inc., in Washington, D.C.

Source: Reprinted news release via NASA 

Friday, November 19, 2010

Trained Bacteria Convert Bio-Wastes Into Plastic

Bacteria in training. Credit: B-Basic consortium
Researcher Jean-Paul Meijnen has 'trained' bacteria to convert all the main sugars in vegetable, fruit and garden waste efficiently into high-quality environmentally friendly products such as bioplastics.

He will defend his doctoral thesis on this research, which was carried out in the context of the NWO B-Basic programme, at TU Delft on Monday 22 November 2010.

There is considerable interest in bioplastics nowadays. The technical problems associated with turning potato peel into sunglasses, or cane sugar into car bumpers, have already been solved. The current methods, however, are not very efficient: only a small percentage of the sugars can be converted into valuable products. By adapting the eating pattern of bacteria and subsequently training them, Meijnen has succeeded in converting these sugars into processable materials, so that no bio-waste is wasted.

Basis for bioplastics
The favoured raw materials for such processes are biological wastes left over from food production. Lignocellulose, the complex combination of lignin and cellulose present in the stalks and leaves of plants that gives them their rigidity, is such a material. Hydrolysis of lignocellulose breaks down the long sugar chains that form the backbone of this material, releasing the individual sugar molecules. These sugar molecules can be further processed by bacteria and other micro-organisms to form chemicals that can be used as the basis for bioplastics. The fruit of the plant, such as maize, can be consumed as food, while the unused waste such as lignocellulose forms the raw material for bioplastics.

Cutting the price of the process
"Unfortunately, the production of plastics from bio-wastes is still quite an expensive process, because the waste material is not fully utilized," explains Jean-Paul Meijnen. (It should be noted here that we are talking about agricultural bio-wastes in this context, not the garden waste recycled by households.) The pre-treatment of these bio-wastes leads to the production of various types of sugars such as glucose, xylose and arabinose. These three together make up about eighty per cent of the sugars in bio-waste.

The problem is that the bacterium Meijnen was working with, Pseudomonas putida S12, can digest glucose but not xylose or arabinose. As a result, a quarter of the eighty per cent remains unused. "A logical way of reducing the cost price of bioplastics is thus to 'teach' the bacteria to digest xylose and arabinose too."
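The sugar bookkeeping in the two paragraphs above works out as follows. Since glucose, xylose and arabinose together make up about 80% of the sugars, and the unmodified strain leaves a quarter of that 80% (the xylose and arabinose) untouched:

```python
# Sugar fractions quoted in the release: the three main sugars are
# ~80% of all sugars in the pretreated bio-waste, and the unmodified
# P. putida S12 strain digests only the glucose portion, leaving a
# quarter of that 80% unused.
total_fraction = 0.80   # glucose + xylose + arabinose
unused_share = 0.25     # the xylose + arabinose quarter

unused = total_fraction * unused_share
usable_before = total_fraction - unused
print(f"unused before training: {unused:.0%} of all sugars")        # 20%
print(f"digestible before training: {usable_before:.0%}")           # 60%
print(f"digestible after training:  {total_fraction:.0%}")          # 80%
```

Training the bacteria to digest the remaining two sugars thus raises the usable share from 60% to the full 80%.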

Enzymes
The xylose has to be 'prepared' before Pseudomonas putida S12 can digest it. This is done with the aid of certain enzymes. The bacteria are genetically modified by inserting specific DNA fragments in the cell; this enables them to produce enzymes that assist in the conversion of xylose into a molecule that the bacteria can deal with.

Meijnen achieved this by introducing two genes from another bacterium (E. coli) which code for two enzymes that enable xylose to be converted in a two-stage process into a molecule that P. putida S12 can digest.

Evolution
This method did work, but not very efficiently: only twenty per cent of the xylose present was digested. The modified bacteria were therefore 'trained' to digest more xylose. Meijnen did this by subjecting the bacteria to an evolutionary process, successively selecting the bacteria that showed the best performance.

"After three months of this improvement process, the bacteria could quickly digest all the xylose present in the medium. And surprisingly enough, these trained bacteria could also digest arabinose, and were thus capable of dealing with the three principal sugars in bio-wastes." Meijnen also incorporated other genes, from the bacterium Caulobacter crescentus. This procedure also proved effective and efficient from the start.

Blend
Finally, in a separate project Meijnen succeeded in modifying a strain of Pseudomonas putida S12 that had previously been modified to produce para-hydroxybenzoate (pHB), a member of the class of chemicals known as parabens that are widely used as preservatives in the cosmetics and pharmaceutical industries.

Meijnen tested the ability of these bacteria to produce pHB, a biochemical substance, from xylose and from other sources such as glucose and glycerol. He summarized his results as follows: "This strategy also proved successful, allowing us to make biochemical substances such as pHB from glucose, glycerol and xylose. In fact, the use of mixtures of glucose and xylose, or glycerol and xylose, gives better pHB production than the use of unmixed starting materials. This means that giving the bacteria pretreated bio-wastes as starting material stimulates them to make even more pHB."

Source: Reprinted news release via Delft University of Technology

Thursday, November 18, 2010

Huge Planet From Another Galaxy Discovered

This artist's impression shows HIP 13044 b, an exoplanet orbiting a star that entered our galaxy, the Milky Way, from another galaxy. This planet of extragalactic origin was detected by a European team of astronomers using the MPG/ESO 2.2-meter telescope at ESO's La Silla Observatory in Chile. The Jupiter-like planet is particularly unusual, as it is orbiting a star nearing the end of its life and could be about to be engulfed by it, giving clues about the fate of our own planetary system in the distant future. Credit: ESO/L. Calçada
Over the last 15 years, astronomers have detected nearly 500 planets orbiting stars in our cosmic neighbourhood, but none outside our Milky Way has been confirmed. Now, however, a planet with a minimum mass 1.25 times that of Jupiter has been discovered orbiting a star of extragalactic origin, even though the star now finds itself within our own galaxy.

It is part of the so-called Helmi stream — a group of stars that originally belonged to a dwarf galaxy that was devoured by our galaxy, the Milky Way, in an act of galactic cannibalism about six to nine billion years ago. The results are published today in Science Express.

"This discovery is very exciting," says Rainer Klement of the Max-Planck-Institut für Astronomie (MPIA), who was responsible for the selection of the target stars for this study. "For the first time, astronomers have detected a planetary system in a stellar stream of extragalactic origin. Because of the great distances involved, there are no confirmed detections of planets in other galaxies. But this cosmic merger has brought an extragalactic planet within our reach."

The star is known as HIP 13044, and it lies about 2000 light-years from Earth in the southern constellation of Fornax (the Furnace). The astronomers detected the planet, called HIP 13044 b, by looking for the tiny telltale wobbles of the star caused by the gravitational tug of an orbiting companion. For these precise observations, the team used the high-resolution spectrograph FEROS attached to the 2.2-metre MPG/ESO telescope at ESO's La Silla Observatory in Chile.

Adding to its claim to fame, HIP 13044 b is also one of the few exoplanets known to have survived the period when its host star expanded massively after exhausting the hydrogen fuel supply in its core — the red giant phase of stellar evolution. The star has now contracted again and is burning helium in its core. Until now, these so-called horizontal branch stars have remained largely uncharted territory for planet-hunters.

"This discovery is part of a study where we are systematically searching for exoplanets that orbit stars nearing the end of their lives," says Johny Setiawan, also from MPIA, who led the research. "This discovery is particularly intriguing when we consider the distant future of our own planetary system, as the Sun is also expected to become a red giant in about five billion years."

HIP 13044 b is close to its host star. At the closest point in its elliptical orbit, it is less than one stellar diameter from the surface of the star (or 0.055 times the Sun-Earth distance). It completes an orbit in only 16.2 days. Setiawan and his colleagues hypothesise that the planet's orbit might initially have been much larger, but that it moved inwards during the red giant phase.
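The 16.2-day period by itself pins down roughly how large the orbit is, via Kepler's third law. The stellar mass is not given in the release, so the sketch below assumes ~0.8 solar masses, a typical value for a horizontal branch star; the result is consistent with the very close-in orbit described above:

```python
# Kepler's third law in solar units: a^3 = M * P^2, with the semi-major
# axis a in AU, stellar mass M in solar masses and period P in years.
# The ~0.8 solar-mass figure is an ASSUMPTION (not stated in the
# release); it is a typical horizontal-branch stellar mass.
period_days = 16.2
stellar_mass_msun = 0.8  # assumed

period_years = period_days / 365.25
semi_major_axis_au = (stellar_mass_msun * period_years**2) ** (1 / 3)
print(f"semi-major axis ~ {semi_major_axis_au:.3f} AU")  # ~0.12 AU
```

An orbit of roughly a tenth of the Sun-Earth distance fits naturally with a closest approach of 0.055 AU on an eccentric path.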

Any closer-in planets may not have been so lucky. "The star is rotating relatively quickly for a horizontal branch star," says Setiawan. "One explanation is that HIP 13044 swallowed its inner planets during the red giant phase, which would make the star spin more quickly."

Although HIP 13044 b has escaped the fate of these inner planets so far, the star will expand again in the next stage of its evolution. HIP 13044 b may therefore be about to be engulfed by the star, meaning that it is doomed after all. This could also foretell the demise of our outer planets — such as Jupiter — when the Sun approaches the end of its life.

The star also poses interesting questions about how giant planets form, as it appears to contain very few elements heavier than hydrogen and helium — fewer than any other star known to host planets. "It is a puzzle for the widely accepted model of planet formation to explain how such a star, which contains hardly any heavy elements at all, could have formed a planet. Planets around stars like this must probably form in a different way," adds Setiawan.

Source: Reprinted news release via ESO

Magnetic Trapping Will Help Unlock The Secrets Of Anti-Matter

A clearer understanding of the Universe, its origins and maybe even its destiny is a significant step closer, thanks to new research.

As part of a major international experiment called ALPHA, based at CERN in Switzerland, researchers have helped to trap and hold atoms of 'anti-hydrogen', something that has not previously been possible.

The project involves physicists at Swansea University led by Professor Mike Charlton, Dr Niels Madsen and Dr Dirk Peter van der Werf and the University of Liverpool under Professor Paul Nolan, all supported by the Engineering and Physical Sciences Research Council (EPSRC).

This breakthrough will make it possible to study 'anti-matter' closely for the first time, and so develop unprecedented insight into its composition/structure and improve understanding of the fundamental physical principles that underpin the Universe and the way it works.

For nearly a decade, scientists have been able to undertake the controlled production of anti-hydrogen atoms in the laboratory, a breakthrough to which Swansea University also contributed, with EPSRC support. But as anti-matter particles are instantly annihilated when they come into contact with matter, it has not, until now, been feasible to study anti-hydrogen atoms in any detail.

ALPHA has therefore developed techniques that not only cool and slow down the anti-particles that make up anti-hydrogen and gently mix them to produce anti-hydrogen atoms, but also trap some of the anti-atoms for long enough so they can be studied.

The key focus of this effort has been the development of electromagnetic traps that hold a number of cold species inside. These traps don't just provide the conditions needed to cool the anti-particles prior to mixing. The cold anti-atoms formed also have a tiny 'magnetic moment', which means they respond to magnetic fields. By arranging the magnet coils in the right way, it is possible to set up a magnetic 'well' in the centre of the anti-particle mixing zone where anti-hydrogen has been trapped.

"Every type of particle has its anti-matter equivalent which is its mirror image in terms of having, for instance, the opposite electrical charge" says Professor Charlton. "Because hydrogen is the simplest of all atoms, anti-hydrogen is the easiest type of anti-matter to produce in the laboratory. By studying it for the first time, we will be able to understand its properties and establish whether it really is the exact mirror image of hydrogen.

"That understanding will hopefully enable us to shed light on exactly why almost everything in the known Universe consists of matter, rather than anti-matter, and what the implications are in terms of the fundamental way that the Universe functions."

To detect the anti-hydrogen atoms, they were released from the trap. The silicon detector used to determine the positions of the resulting annihilations was developed and built at Liverpool. Professor Nolan comments: "The unique clean room and workshop facilities in Liverpool, together with detector and electronics expertise, allowed us to build this complex and unique instrument that is now part of the ALPHA experiment."

Dr Niels Madsen notes: "Trapping of anti-hydrogen is a major breakthrough in antimatter physics. Having the anti-atoms trapped will allow for comparisons of matter and anti-matter to a level that until now would have been considered wishful thinking."

The initiative is expected to run for several years, with ALPHA commencing tests on anti-hydrogen atoms in around five years' time.

Source: Reprinted news release via Engineering and Physical Sciences Research Council

Related Stories:

Researchers Uncover Surprise Link Between Weird Quantum Phenomena

Researchers have uncovered a fundamental link between the two defining properties of quantum physics. Stephanie Wehner of Singapore's Centre for Quantum Technologies and the National University of Singapore and Jonathan Oppenheim of the United Kingdom's University of Cambridge published their work today in the latest edition of the journal Science.

The result is being heralded as a dramatic breakthrough in our basic understanding of quantum mechanics and provides new clues to researchers seeking to understand the foundations of quantum theory. The result addresses the question of why quantum behaviour is as weird as it is—but no weirder.

The strange behaviour of quantum particles, such as atoms, electrons and the photons that make up light, has perplexed scientists for nearly a century. Albert Einstein was among those who thought the quantum world was so strange that quantum theory must be wrong, but experiments have borne out the theory's predictions.

One of the weird aspects of quantum theory is that it is impossible to know certain things, such as a particle's momentum and position, simultaneously. Knowledge of one of these properties affects the accuracy with which you can learn the other. This is known as the "Heisenberg Uncertainty Principle".
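In standard textbook form (not spelled out in the release), the position-momentum version of the principle bounds the product of the two uncertainties:

```latex
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
```

where \(\Delta x\) and \(\Delta p\) are the standard deviations of position and momentum, and \(\hbar\) is the reduced Planck constant. Measuring one quantity more precisely necessarily leaves the other less determined.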

Another weird aspect is the quantum phenomenon of non-locality, which arises from the better-known phenomenon of entanglement. When two quantum particles are entangled, they can perform actions that look as if they are coordinated with each other in ways that defy classical intuition about physically separated particles.

Previously, researchers have treated non-locality and uncertainty as two separate phenomena. Now Wehner and Oppenheim have shown that they are intricately linked. What's more, they show that this link is quantitative and have found an equation which shows that the "amount" of non-locality is determined by the uncertainty principle.

"It's a surprising and perhaps ironic twist," said Oppenheim, a Royal Society University Research Fellow from the Department of Applied Mathematics & Theoretical Physics at the University of Cambridge. Einstein and his co-workers discovered non-locality while searching for a way to undermine the uncertainty principle. "Now the uncertainty principle appears to be biting back."

Non-locality determines how well two distant parties can coordinate their actions without sending each other information. Physicists believe that even in quantum mechanics, information cannot travel faster than light. Nevertheless, it turns out that quantum mechanics allows two parties to coordinate much better than would be possible under the laws of classical physics. In fact, their actions can be coordinated in a way that almost seems as if they had been able to talk. Einstein famously referred to this phenomenon as "spooky action at a distance".

However, quantum non-locality could be even spookier than it actually is. It's possible to have theories which allow distant parties to coordinate their actions much better than nature allows, while still not allowing information to travel faster than light. Nature could be weirder, and yet it isn't – quantum theory appears to impose an additional limit on the weirdness.

"Quantum theory is pretty weird, but it isn't as weird as it could be. We really have to ask ourselves, why is quantum mechanics this limited? Why doesn't nature allow even stronger non-locality?" Oppenheim says.

The surprising result by Wehner and Oppenheim is that the uncertainty principle provides an answer. Two parties can only coordinate their actions better if they break the uncertainty principle, which imposes a strict bound on how strong non-locality can be.

"It would be great if we could better coordinate our actions over long distances, as it would enable us to solve many information processing tasks very efficiently," Wehner says. "However, physics would be fundamentally different. If we break the uncertainty principle, there is really no telling what our world would look like."

How did the researchers discover a connection that had gone unnoticed so long? Before entering academia, Wehner worked as a 'computer hacker for hire', and now works in quantum information theory, while Oppenheim is a physicist. Wehner thinks that applying techniques from computer science to the laws of theoretical physics was key to spotting the connection. "I think one of the crucial ideas is to link the question to a coding problem," Wehner says. "Traditional ways of viewing non-locality and uncertainty obscured the close connection between the two concepts."

Wehner and Oppenheim recast the phenomena of quantum physics in terms that would be familiar to a computer hacker. They treat non-locality as the result of one party, Alice, creating and encoding information, and a second party, Bob, retrieving information from the encoding. How well Alice and Bob can encode and retrieve information is determined by uncertainty relations. In some situations, they found, a third property known as "steering" enters the picture.

Wehner and Oppenheim compare their discovery to uncovering what determines how easily two players can win a quantum board game: the board has only two squares, on which Alice can place a counter of two possible colours: green or pink. She is told either to place the same colour on both squares or to place a different colour on each. Bob has to guess the colour that Alice put on square one or square two. If his guess is correct, Alice and Bob win the game. Clearly, Alice and Bob could win the game if they could talk to each other: Alice would simply tell Bob what colours are on squares one and two. But Bob and Alice are situated so far apart from each other that light – and thus an information-carrying signal – does not have time to pass between them during the game.

If they can't talk, they won't always win, but by making measurements on quantum particles, they can win the game more often than with any strategy that doesn't rely on quantum theory. However, the uncertainty principle prevents them from doing any better, and even determines how often they lose the game.
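The two-square game described above maps onto what physicists call the CHSH game, the textbook example of quantum non-locality. As an illustrative sketch (not part of the original release), a few lines of Python can enumerate every deterministic classical strategy to find the classical winning limit of 75 percent, and compare it with the quantum limit, Tsirelson's bound of cos²(π/8), roughly 85.4 percent:

```python
import math
from itertools import product

# In the CHSH game, Alice and Bob receive random input bits x and y,
# output bits a and b without communicating, and win when a XOR b == x AND y.

def classical_best():
    """Best winning probability over all deterministic classical strategies."""
    best = 0.0
    # A strategy fixes Alice's output for each input (a0, a1)
    # and Bob's output for each input (b0, b1): 16 strategies in total.
    for a0, a1, b0, b1 in product([0, 1], repeat=4):
        wins = 0
        for x, y in product([0, 1], repeat=2):
            a = a0 if x == 0 else a1
            b = b0 if y == 0 else b1
            if (a ^ b) == (x & y):
                wins += 1
        best = max(best, wins / 4)
    return best

quantum_best = math.cos(math.pi / 8) ** 2  # Tsirelson's bound

print(classical_best())          # 0.75
print(round(quantum_best, 4))    # 0.8536
```

The point of Wehner and Oppenheim's result is that the quantum value stops at about 85.4 percent, rather than the logically conceivable 100 percent, precisely because doing better would violate the uncertainty principle.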

The finding bears on the deep question of what principles underlie quantum physics. Many attempts to understand the underpinnings of quantum mechanics have focused on non-locality. Wehner thinks there may be more to gain from examining the details of the uncertainty principle. "However, we have barely scratched the surface of understanding uncertainty relations," she says.

The breakthrough is future-proof, the researchers say. Scientists are still searching for a quantum theory of gravity and Wehner and Oppenheim's result concerning non-locality, uncertainty and steering applies to all possible theories – including any future replacement for quantum mechanics.

Source: Reprinted news release via Centre for Quantum Technologies at the National University of Singapore

Related Stories:

NASA Spacecraft Sees Cosmic Snow Storm During Comet Encounter

Infrared scans of comet Hartley 2 by NASA's EPOXI mission spacecraft show carbon dioxide, dust, and ice being distributed in a similar way and emanating from apparently the same locations on the nucleus. Water vapor, however, has a different distribution implying a different source region and process. Image Credit: NASA/JPL-Caltech/UMD
PASADENA, Calif. -- The EPOXI mission's recent encounter with comet Hartley 2 provided the first images clear enough for scientists to link jets of dust and gas with specific surface features. NASA and other scientists have begun to analyze the images.

The EPOXI mission spacecraft revealed a cometary snow storm created by carbon dioxide jets spewing out tons of golf-ball to basketball-sized fluffy ice particles from the peanut-shaped comet's rocky ends. At the same time, a different process was causing water vapor to escape from the comet's smooth mid-section. This information sheds new light on the nature of comets and even planets.

Scientists compared the new data to data from a comet the spacecraft previously visited that was somewhat different from Hartley 2. In 2005, the spacecraft successfully released an impactor into the path of comet Tempel 1, while observing it during a flyby.

"This is the first time we've ever seen individual chunks of ice in the cloud around a comet or jets definitively powered by carbon dioxide gas," said Michael A'Hearn, principal investigator for the spacecraft at the University of Maryland. "We looked for, but didn't see, such ice particles around comet Tempel 1."

The new findings show Hartley 2 acts differently than Tempel 1 or the three other comets with nuclei imaged by spacecraft. Carbon dioxide appears to be a key to understanding Hartley 2 and explains why the smooth and rough areas scientists saw respond differently to solar heating, and have different mechanisms by which water escapes from the comet's interior.

"When we first saw all the specks surrounding the nucleus, our mouths dropped," said Pete Schultz, EPOXI mission co-investigator at Brown University. "Stereo images reveal there are snowballs in front and behind the nucleus, making it look like a scene in one of those crystal snow globes."

Data show the smooth area of comet Hartley 2 looks and behaves like most of the surface of comet Tempel 1, with water evaporating below the surface and percolating out through the dust. However, the rough areas of Hartley 2, with carbon dioxide jets spraying out ice particles, are very different.

"The carbon dioxide jets blast out water ice from specific locations in the rough areas resulting in a cloud of ice and snow," said Jessica Sunshine, EPOXI deputy principal investigator at the University of Maryland. "Underneath the smooth middle area, water ice turns into water vapor that flows through the porous material, with the result that close to the comet in this area we see a lot of water vapor."

Engineers at NASA's Jet Propulsion Laboratory in Pasadena, Calif., have been looking for signs that ice particles peppered the spacecraft. So far they have found nine instances when particles, each estimated to weigh slightly less than a snowflake, may have hit the spacecraft without damaging it.

"The EPOXI mission spacecraft sailed through Hartley 2's ice flurries in fine working order and continues to take images as planned of this amazing comet," said Tim Larson, EPOXI project manager at JPL.

Scientists will need more detailed analysis to determine how long this snow storm has been active, and whether the differences in activity between the middle and ends of the comet are the result of how it formed some 4.5 billion years ago or are because of more recent evolutionary effects.

EPOXI is a combination of the names for the mission's two components: the Extrasolar Planet Observations and Characterization (EPOCh), and the flyby of comet Hartley 2, called the Deep Impact Extended Investigation (DIXI).

JPL manages the EPOXI mission for the Science Mission Directorate at NASA Headquarters in Washington. The spacecraft was built for NASA by Ball Aerospace & Technologies Corp., in Boulder, Colo.

For more information about EPOXI, visit: http://www.nasa.gov/epoxi

Source: Reprinted news release via NASA

Related Stories:

Hayabusa Spacecraft Returns Asteroid Artifacts From Space

Scientists involved with the first space mission to attempt to sample asteroid surface material and return it to Earth have confirmed the presence of particles collected in a small container aboard the Japan Aerospace Exploration Agency (JAXA) Hayabusa spacecraft.

On June 14, 2010 (local time), Hayabusa landed in the remote Woomera Test Range in South Australia, concluding a remarkable mission of exploration -- one in which NASA scientists and engineers played a contributing role.

Initial examination with an electron microscope reveals about 1,500 grains identified as rocky particles, judged to be of extraterrestrial origin and to have come from the asteroid. Most are smaller than 10 micrometers. Handling these grains requires very special skills and techniques. JAXA is developing the necessary handling techniques and preparing the associated equipment for further analyses.

Launched May 9, 2003, from the Kagoshima Space Center, Uchinoura, Japan, Hayabusa was designed as a flying testbed. Its mission: to research several new engineering technologies necessary for returning planetary samples to Earth for further study. With Hayabusa, JAXA scientists and engineers hoped to obtain detailed information on electrical propulsion and autonomous navigation, as well as an asteroid sampler and sample reentry capsule.

The 510-kilogram (950-pound) Hayabusa spacecraft rendezvoused with asteroid Itokawa in September 2005. Over the next two-and-a-half months, the spacecraft made up-close and personal scientific observations of the asteroid's shape, terrain, surface altitude distribution, mineral composition, gravity, and the way it reflected the sun's rays. On Nov. 25 of that year, Hayabusa briefly touched down on the surface of Itokawa. That was only the second time in history a spacecraft descended to the surface of an asteroid (NASA's Near Earth Asteroid Rendezvous-Shoemaker spacecraft landed on asteroid Eros on Feb. 12, 2001).

The spacecraft departed Itokawa in January 2007. A team of Japanese and American navigators guided Hayabusa on the final leg of its journey. Together, they calculated the final trajectory correction maneuvers Hayabusa's ion propulsion system had to perform for a successful homecoming.

To obtain the data they needed, the navigation team frequently called upon JAXA's tracking stations in Japan, as well as those of NASA's Deep Space Network, which has antennas at Goldstone, in California's Mojave Desert; near Madrid, Spain; and near Canberra, Australia. In addition, the stations provided mission planners with near-continuous communications with the spacecraft to keep them informed on spacecraft health.

After the spacecraft returned, team members retrieved it and transported it to JAXA's sample curatorial facility in Sagamihara, Japan. There, Japanese astromaterials scientists, assisted by two scientists from NASA and one from Australia, performed preliminary cataloging and analysis of the capsule's contents.

Image: JAXA’s Hayabusa spacecraft leaves a streak of light behind the clouds as it re-enters Earth’s atmosphere over the Woomera Test Range in Australia. Credit: NASA/Ed Schilling

For more information about the Hayabusa mission, visit: http://www.isas.jaxa.jp/e/enterp/missions/hayabusa/index.shtml. 

Source: Reprinted news release via NASA

Related Stories:

Wednesday, November 17, 2010

What Will Threaten Us In 2040? Tiny Robots?

Could terrorists of the future use a swarm of tiny robots — less than a quarter-inch high — to attack their targets? Will new bio materials be able to target individuals carrying specific genetic markers? Could cyber-attackers melt down a nuclear facility with the press of a "return" key, or implant chips to control our minds?

These scenarios may sound like science fiction, but according to Dr. Yair Sharan, Director of the Interdisciplinary Center for Technological Analysis and Forecasting (ICTAF) at Tel Aviv University, they're all within the realm of possibility in the next few decades. That's why it's critical for nations to be aware of the risks, and primed to mitigate them to avert another 9/11 or Mumbai terror attack.

As head of a pan-European project called FESTOS (Foresight of Evolving Security Threats Posed by Emerging Technologies: http://www.festos.org), Dr. Sharan and his colleagues are looking 30 years into the future to determine what our real technological threats will be. At the end of their three-year project, already underway, they'll issue a detailed task report to describe the threats and suggest to leaders of democratic nations how they can avoid them.

Forecasting disaster
Part of ICTAF's work looks for "signals" in politics, news reports, and advanced high-tech coverage to assess what technologies and applications could be used for future crime and terror. "While America did not foresee the scale of 9/11, the signs were there that such an act was a possible event," says Dr. Sharan. He calls 9/11 an example of a "wild card" –– an event or scenario with a low probability and a very high impact. "Our mission is to forecast wild card calamities, natural and manmade, so that nations can be alert and poised to avoid human casualties."

The FESTOS team's method also uses questionnaires and interviews with 250 experts from the United States and Europe in a variety of disciplines including chemistry, robotics and computer sciences. The research team analyzes the data to determine and classify future threats, and proposes strategies to mitigate the risks.

At Tel Aviv University, researchers dig into the numbers to estimate threat probabilities. With the input of technology pioneers and scientists, they are exploring what inventions might be available that are meant to improve our lives, but have the potential to be used for malicious purposes. They are "technology mapping," looking into possibilities such as robot terrorists, dangerous new chemicals, and pioneering materials born of biotech and nanotech.

Tomorrow's terror attacks will most likely be information technology-related, Dr. Sharan predicts. Forecasters envision an attack on a country's energy supply, or a cyber attack on a major airport, especially since attacks on the White House and on Iran's nuclear facilities have shown how vulnerable critical infrastructure systems can be.

Experience with terrorism provides an advantage
Unfortunately, Dr. Sharan observes, democratic nations like the United States, the United Kingdom, and Spain have learned over the last decade that threats from terror are not limited to Israel. But Israel's unrelenting experience with terrorism, and Tel Aviv University's demonstrated expertise in forecasting, have created a laboratory for work that can have a profound impact on Western policy making and planning. And knowing what's possible will arm future leaders with the tools to protect their citizens.

After the forecasts in the FESTOS study are collected, the results will be shared with decision and policy makers in governments in Europe, Israel, the USA and other democratic nations. Policy makers will then be able to prepare for "foreseen" surprises.

Tel Aviv University is also taking a leading role in another significant foresight project. ICTAF now heads the Israeli component of the Millennium Project, previously under the auspices of the United Nations, which assesses the future state of the world in the areas of politics, science and technology, health practice, and economics.

Source: Reprinted news release via American Friends of Tel Aviv University


Related Stories:

International Research Team Trap 38 Antimatter Atoms

In the movie Angels and Demons, scientists have solved one of the most perplexing scientific problems: the capture and storage of antimatter. In real life, trapping atomic antimatter has never been accomplished, until now.

A team of researchers from the University of Calgary and institutions across Canada and around the world has discovered how to trap atomic antimatter, and the results of their discovery are published in the journal Nature.

“This is a major discovery. It could enable experiments that result in dramatic changes to the current view of fundamental physics or in confirmation of what we already know now,” says Dr. Rob Thompson, head of physics and astronomy at the University of Calgary and co-investigator in the ALPHA collaboration, one of two teams competing to gain a better understanding of antimatter and our universe.

Both teams, ALPHA and the Harvard-led ATRAP, have been at this race for over five years, conducting experiments in close quarters at CERN (European Organization for Nuclear Research), the world's largest particle physics lab, located in the suburbs of Geneva, Switzerland. CERN is the only laboratory in the world with the proper equipment where this research can be carried out.

“These are significant steps in antimatter research,” said CERN Director General Dr. Rolf Heuer, “and an important part of the very broad research programme at CERN.”

The goal of the competition involves trapping and storing the simplest of all antimatter atoms, antihydrogen, with the purpose of studying it. Hydrogen is the lightest and most abundant chemical element.

“We know a lot about matter, but very little about antimatter. We assume there was as much antimatter created in the Big Bang as matter. There are many questions. Where is the antimatter? Where did it go? And why does it appear that there is more matter than antimatter?”  says Dr. Makoto Fujiwara, adjunct professor in the Department of Physics and Astronomy at the University of Calgary and a research scientist at TRIUMF, Canada's national laboratory for particle and nuclear physics.

ALPHA-Canada scientists and students have been playing leading roles in the experiment. “It's been a rare privilege and a tremendous learning experience taking part in this groundbreaking international endeavour,” says Richard Hydomako, a PhD student at the University of Calgary.

Trapping antimatter is tricky. When matter and antimatter get too close, they destroy each other in a kind of explosion, leaving behind the energy which made them. The challenge is cooling the anti-atoms to about 272 degrees below zero Celsius, less than a degree above absolute zero, so that they are slow enough to be held in a magnetic storage device.

“We've been able to trap about 38 atoms, which is an incredibly small amount, nothing like what we would need to power Star Trek's starship Enterprise or even to heat a cup of coffee,” says Thompson, one of 42 co-authors of the Nature paper along with the University of Calgary’s Makoto Fujiwara and graduate students Richard Hydomako and Tim Friesen.

“Now we can start working on the next step which is to use tools to study it,” adds Thompson.

The paper, entitled Trapped antihydrogen, is published in Nature.
Source: Reprinted news release via University of Calgary

Related Posts:

NASA's Chandra Finds Youngest Nearby Black Hole

A supernova within the galaxy M100. Credits: X-ray: NASA/CXC/SAO/D.Patnaude et al, Optical: ESO/VLT, Infrared: NASA/JPL/Caltech
Astronomers using NASA's Chandra X-ray Observatory have found evidence of the youngest black hole known to exist in our cosmic neighborhood. The 30-year-old black hole provides a unique opportunity to watch this type of object develop from infancy.

The black hole could help scientists better understand how massive stars explode, which ones leave behind black holes or neutron stars, and the number of black holes in our galaxy and others.

The 30-year-old object is a remnant of SN 1979C, a supernova in the galaxy M100 approximately 50 million light years from Earth. Data from Chandra, NASA's Swift satellite, the European Space Agency's XMM-Newton and the German ROSAT observatory revealed a bright source of X-rays that has remained steady during observation from 1995 to 2007. This suggests the object is a black hole being fed either by material falling into it from the supernova or a binary companion.

"If our interpretation is correct, this is the nearest example where the birth of a black hole has been observed," said Daniel Patnaude of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Mass. who led the study.

The scientists think SN 1979C, first discovered by an amateur astronomer in 1979, formed when a star about 20 times more massive than the sun collapsed. Many new black holes in the distant universe previously have been detected in the form of gamma-ray bursts (GRBs). However, SN 1979C is different because it is much closer and belongs to a class of supernovas unlikely to be associated with a GRB. Theory predicts most black holes in the universe should form when the core of a star collapses and a GRB is not produced.

"This may be the first time the common way of making a black hole has been observed," said co-author Abraham Loeb, also of the Harvard-Smithsonian Center for Astrophysics. "However, it is very difficult to detect this type of black hole birth because decades of X-ray observations are needed to make the case."

The idea of a black hole with an observed age of only about 30 years is consistent with recent theoretical work. In 2005, a theory was presented that the bright optical light of this supernova was powered by a jet from a black hole that was unable to penetrate the hydrogen envelope of the star to form a GRB. The results seen in the observations of SN 1979C fit this theory very well.

Although the evidence points to a newly formed black hole in SN 1979C, another intriguing possibility is that a young, rapidly spinning neutron star with a powerful wind of high energy particles could be responsible for the X-ray emission. This would make the object in SN 1979C the youngest and brightest example of such a "pulsar wind nebula" and the youngest known neutron star. The Crab pulsar, the best-known example of a bright pulsar wind nebula, is about 950 years old.

"It's very rewarding to see how the commitment of some of the most advanced telescopes in space, like Chandra, can help complete the story," said Jon Morse, head of the Astrophysics Division at NASA's Science Mission Directorate.

The results will appear in the New Astronomy journal in a paper by Patnaude, Loeb, and Christine Jones of the Harvard-Smithsonian Center for Astrophysics. NASA's Marshall Space Flight Center in Huntsville, Ala., manages the Chandra program for the agency's Science Mission Directorate in Washington. The Smithsonian Astrophysical Observatory controls Chandra's science and flight operations from Cambridge. 

Source: Reprinted news release via NASA

Related Posts:

Tuesday, November 16, 2010

In The Future There Is Sky Mowing [Video]

If you think mowing your lawn is hard enough work, just think how hard it will be to mow the sky. In the future, that's exactly what we will have to do: mow the sky grass!

Okay okay, I admit, I'm being a little sarcastic. But you must see this video of a remote control flying lawnmower. It's a trip!

Up! Up! And moway!



[WatchMeSlapYourShit]

 Related Posts:

Astronomers Discover Merging Star Systems That Might Explode

The binary star system J0923+3028 consists of two white dwarfs: a visible star weighing 23 percent as much as our Sun and about four times the diameter of Earth, and an unseen companion weighing 44 percent of the Sun and about one Earth-diameter in size. The stars are currently separated by about 220,000 miles and orbit each other once per hour. The stars will spiral in toward each other and merge in about 100 million years.
Credit: Clayton Ellis (CfA)
Sometimes when you're looking for one thing, you find something completely different and unexpected. In the scientific endeavor, such serendipity can lead to new discoveries. Today, researchers who found the first hypervelocity stars escaping the Milky Way announced that their search also turned up a dozen double-star systems. Half of those are merging and might explode as supernovae in the astronomically near future.

All of the newfound binary stars consist of two white dwarfs. A white dwarf is the hot, dead core left over when a sun-like star gently puffs off its outer layers as it dies. A white dwarf is incredibly dense, packing as much as a sun's worth of material into a sphere the size of Earth. A teaspoon of it would weigh more than a ton.
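The teaspoon claim holds up to a quick back-of-the-envelope check. This sketch (the constants are standard values, not from the release) packs one solar mass into an Earth-sized sphere and weighs a teaspoon of the result:

```python
import math

M_SUN = 1.989e30       # solar mass, kg
R_EARTH = 6.371e6      # Earth's mean radius, m
TEASPOON = 5e-6        # teaspoon volume, m^3 (about 5 millilitres)

# Mean density of a solar mass spread through an Earth-sized sphere.
volume = (4 / 3) * math.pi * R_EARTH**3
density = M_SUN / volume            # kg per m^3
teaspoon_mass = density * TEASPOON  # kg

print(f"{density:.2e} kg/m^3")   # around 2e9 kg/m^3
print(f"{teaspoon_mass:.0f} kg") # several thousand kg -- well over a ton
```

The answer comes out at roughly nine tonnes per teaspoon, comfortably consistent with "more than a ton."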

"These are weird systems - objects the size of the Earth orbiting each other at a distance less than the radius of the Sun," said Smithsonian astronomer Warren Brown, lead author of the two papers reporting the find.

The white dwarfs found in this survey are lightweights among white dwarfs, holding only about one-fifth as much mass as the Sun. They are made almost entirely of helium, unlike normal white dwarfs made of carbon and oxygen.

"These white dwarfs have gone through a dramatic weight loss program," said Carlos Allende Prieto, an astronomer at the Instituto de Astrofisica de Canarias in Spain and a co-author of the study. "These stars are in such close orbits that tidal forces, like those swaying the oceans on Earth, led to huge mass losses."

Remarkably, because they whirl around so close to each other, the white dwarfs stir the space-time continuum, creating expanding ripples known as gravitational waves. Those waves carry away orbital energy, causing the stars to spiral closer together. Half of the systems are expected to merge eventually. The tightest binary, orbiting once every hour, will merge in about 100 million years.
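For the caption's numbers, the merger timescale can be estimated with the standard Peters (1964) formula for gravitational-wave decay of a circular orbit. This sketch plugs in the masses and separation quoted above (the formula and physical constants are textbook physics, not taken from the release) and lands at the same order of magnitude as the article's figure of about 100 million years:

```python
# Peters (1964) inspiral time for a circular binary:
#   t = 5 c^5 a^4 / (256 G^3 m1 m2 (m1 + m2))

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
M_SUN = 1.989e30       # solar mass, kg
MILE = 1609.34         # metres per mile
YEAR = 3.156e7         # seconds per year

m1 = 0.23 * M_SUN      # visible white dwarf (J0923+3028)
m2 = 0.44 * M_SUN      # unseen companion
a = 220_000 * MILE     # orbital separation, m

t_merge_s = 5 * c**5 * a**4 / (256 * G**3 * m1 * m2 * (m1 + m2))
t_merge_yr = t_merge_s / YEAR

print(f"{t_merge_yr:.2e} years")  # on the order of 1e8 years
```

The strong a⁴ dependence is why only the very tightest binaries in the sample are expected to merge within a cosmologically interesting time.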

"We have tripled the number of known, merging white-dwarf systems," said Smithsonian astronomer and co-author Mukremin Kilic. "Now, we can begin to understand how these systems form and what they may become in the near future."

When two white dwarfs merge, their combined mass can exceed a tipping point, causing them to detonate and explode as a Type Ia supernova. Brown and his colleagues suggest that the merging binaries they have discovered might be one source of underluminous supernovae -- a rare type of supernova explosion 100 times fainter than a normal Type Ia supernova, which ejects only one-fifth as much matter.

"The rate at which our white dwarfs are merging is the same as the rate of underluminous supernovae - about one every 2,000 years," explained Brown. "While we can't know for sure whether our merging white dwarfs will explode as underluminous supernovae, the fact that the rates are the same is highly suggestive."

Source: Reprinted news release via Harvard-Smithsonian Center for Astrophysics

Related Posts:

Japan's 3D Hologram Rock Star Hatsune Miku In Concert [Video]

Here is a video of Japan's 3D hologram rock star Hatsune Miku performing in concert. This is probably one of the coolest things I've seen in a while. The 3D hologram looks awesome and the music is fantastic! I totally just found my new crush!
 
Read about Hatsune Miku here.
Want more? Buy her video here.



Hatsune Miku was developed by Crypton Future Media using Yamaha’s Vocaloid voice synthesizer. She has had a number-one hit single and continues to sell out concert after concert.
Source: SeyrenLK

Related Posts: