Showing posts with label Technology News. Show all posts

Sunday, February 06, 2011

Engineers Grow Nanolasers On Silicon, Pave Way For On-Chip Photonics

Berkeley – Engineers at the University of California, Berkeley, have found a way to grow nanolasers directly onto a silicon surface, an achievement that could lead to a new class of faster, more efficient microprocessors, as well as to powerful biochemical sensors that use optoelectronic chips.

They describe their work in a paper to be published Feb. 6 in an advanced online issue of the journal Nature Photonics.

"Our results impact a broad spectrum of scientific fields, including materials science, transistor technology, laser science, optoelectronics and optical physics," said the study's principal investigator, Connie Chang-Hasnain, UC Berkeley professor of electrical engineering and computer sciences.

The increasing performance demands of electronics have sent researchers in search of better ways to harness the inherent ability of light particles to carry far more data than electrical signals can. Optical interconnects are seen as a solution to overcoming the communications bottleneck within and between computer chips.

Because silicon, the material that forms the foundation of modern electronics, is extremely deficient at generating light, engineers have turned to another class of materials known as III-V (pronounced "three-five") semiconductors to create light-based components such as light-emitting diodes (LEDs) and lasers.

But the researchers pointed out that marrying III-V with silicon to create a single optoelectronic chip has been problematic. For one, the atomic structures of the two materials are mismatched.

"Growing III-V semiconductor films on silicon is like forcing two incongruent puzzle pieces together," said study lead author Roger Chen, a UC Berkeley graduate student in electrical engineering and computer sciences. "It can be done, but the material gets damaged in the process."

Moreover, the manufacturing industry is set up for the production of silicon-based materials, so for practical reasons, the goal has been to integrate the fabrication of III-V devices into the existing infrastructure, the researchers said.

"Today's massive silicon electronics infrastructure is extremely difficult to change for both economic and technological reasons, so compatibility with silicon fabrication is critical," said Chang-Hasnain. "One problem is that growth of III-V semiconductors has traditionally involved high temperatures – 700 degrees Celsius or more – that would destroy the electronics. Meanwhile, other integration approaches have not been scalable."

The UC Berkeley researchers overcame this limitation by finding a way to grow nanopillars made of indium gallium arsenide, a III-V material, onto a silicon surface at the relatively cool temperature of 400 degrees Celsius.

"Working at nanoscale levels has enabled us to grow high quality III-V materials at low temperatures such that silicon electronics can retain their functionality," said Chen.

The researchers used metal-organic chemical vapor deposition to grow the nanopillars on the silicon. "This technique is potentially mass manufacturable, since such a system is already used commercially to make thin film solar cells and light emitting diodes," said Chang-Hasnain.

Once the nanopillar was made, the researchers showed that it could generate near infrared laser light – a wavelength of about 950 nanometers – at room temperature. The hexagonal geometry dictated by the crystal structure of the nanopillars creates a new, efficient, light-trapping optical cavity. Light circulates up and down the structure in a helical fashion and amplifies via this optical feedback mechanism.

The unique approach of growing nanolasers directly onto silicon could lead to highly efficient silicon photonics, the researchers said. They noted that the minuscule dimensions of the nanopillars – smaller than one wavelength on each side, in some cases – make it possible to pack them into small spaces with the added benefit of consuming very little energy.

"Ultimately, this technique may provide a powerful and new avenue for engineering on-chip nanophotonic devices such as lasers, photodetectors, modulators and solar cells," said Chen.

"This is the first bottom-up integration of III-V nanolasers onto silicon chips using a growth process compatible with the CMOS (complementary metal oxide semiconductor) technology now used to make integrated circuits," said Chang-Hasnain. "This research has the potential to catalyze an optoelectronics revolution in computing, communications, displays and optical signal processing. In the future, we expect to improve the characteristics of these lasers and ultimately control them electronically for a powerful marriage between photonic and electronic devices."

Source: Reprinted news release via University of California - Berkeley

Saturday, February 05, 2011

New Nanomaterials Unlock New Electronic And Energy Technologies

A new way of splitting layered materials to give atom-thin "nanosheets" has been discovered. This has led to a range of novel two-dimensional nanomaterials with chemical and electronic properties that have the potential to enable new electronic and energy storage technologies. The collaborative international research led by the Centre for Research on Adaptive Nanostructures and Nanodevices (CRANN), Trinity College Dublin, Ireland, and the University of Oxford has been published in this week's Science.

The scientists have invented a versatile method for creating these atom-thin nanosheets from a range of materials using common solvents and ultrasound, utilising devices similar to those used to clean jewellery. The new method is simple, fast, and inexpensive, and could be scaled up to work on an industrial scale.

"Of the many possible applications of these new nanosheets, perhaps the most important are as thermoelectric materials. These materials, when fabricated into devices, can generate electricity from waste heat. For example, in gas-fired power plants approximately 50% of energy produced is lost as waste heat while for coal and oil plants the figure is up to 70%. However, the development of efficient thermoelectric devices would allow some of this waste heat to be recycled cheaply and easily, something that has been beyond us, up until now," explained Professor Jonathan Coleman, Principal Investigator at CRANN and the School of Physics, Trinity College Dublin who led the research along with Dr Valeria Nicolosi in the Department of Materials at the University of Oxford.

This research can be compared to the work regarding the two-dimensional material graphene, which won the Nobel Prize in 2010. Graphene has generated significant interest because when separated into individual flakes, it has exceptional electronic and mechanical properties that are very different to those of its parent crystal, graphite. However, graphite is just one of hundreds of layered materials, some of which may enable powerful new technologies.

Coleman's work will open up over 150 similarly exotic layered materials – such as boron nitride, molybdenum disulfide, and bismuth telluride – that have the potential to be metallic, semiconducting or insulating, depending on their chemical composition and how their atoms are arranged. This new family opens the way to a whole range of new "super" materials.

For decades researchers have tried to create nanosheets from layered materials in order to unlock their unusual electronic and thermoelectric properties. However, previous methods were time consuming, laborious or of very low yield and so unsuited to most applications.

"Our new method offers low cost, a very high yield and a very large throughput: within a couple of hours, and with just 1 mg of material, billions and billions of one-atom-thick nanosheets can be made at the same time from a wide variety of exotic layered materials," explained Dr Nicolosi, from the University of Oxford.

These new materials are also suited for use in next generation batteries – "supercapacitors" – which can deliver energy thousands of times faster than standard batteries, enabling new applications such as electric cars. Many of these new atomic layered materials are very strong and can be added to plastics to produce super-strong composites. These will be useful in a range of industries from simple structural plastics to aeronautics.

Source: Reprinted news release via Trinity College Dublin

Future Surgeons May Use Robotic Nurse, 'Gesture Recognition'

Surgeons of the future might use a system that recognizes hand gestures as commands to control a robotic scrub nurse or tell a computer to display medical images of the patient during an operation.

Both the hand-gesture recognition and robotic nurse innovations might help to reduce the length of surgeries and the potential for infection, said Juan Pablo Wachs, an assistant professor of industrial engineering at Purdue University.

The "vision-based hand gesture recognition" technology could have other applications, including the coordination of emergency response activities during disasters.

"It's a concept Tom Cruise demonstrated vividly in the film 'Minority Report,'" Wachs said.

Surgeons routinely need to review medical images and records during surgery, but stepping away from the operating table and touching a keyboard and mouse can delay the surgery and increase the risk of spreading infection-causing bacteria.

The new approach is a system that uses a camera and specialized algorithms to recognize hand gestures as commands to instruct a computer or robot.

At the same time, a robotic scrub nurse represents a potential new tool that might improve operating-room efficiency, Wachs said.

Findings from the research will be detailed in a paper appearing in the February issue of Communications of the ACM, the flagship publication of the Association for Computing Machinery. The paper, featured on the journal's cover, was written by researchers at Purdue, the Naval Postgraduate School in Monterey, Calif., and Ben-Gurion University of the Negev, Israel.

Research into hand-gesture recognition began several years ago in work led by the Washington Hospital Center and Ben-Gurion University, where Wachs was a research fellow and doctoral student, respectively.

He is now working to extend the system's capabilities in research with Purdue's School of Veterinary Medicine and the Department of Speech, Language, and Hearing Sciences.

"One challenge will be to develop the proper shapes of hand poses and the proper hand trajectory movements to reflect and express certain medical functions," Wachs said. "You want to use intuitive and natural gestures for the surgeon, to express medical image navigation activities, but you also need to consider cultural and physical differences between surgeons. They may have different preferences regarding what gestures they may want to use."

Other challenges include providing computers with the ability to understand the context in which gestures are made and to discriminate between intended gestures versus unintended gestures.

"Say the surgeon starts talking to another person in the operating room and makes conversational gestures," Wachs said. "You don't want the robot handing the surgeon a hemostat."

A scrub nurse assists the surgeon and hands the proper surgical instruments to the doctor when needed.

"While it will be very difficult using a robot to achieve the same level of performance as an experienced nurse who has been working with the same surgeon for years, often scrub nurses have had very limited experience with a particular surgeon, maximizing the chances for misunderstandings, delays and sometimes mistakes in the operating room," Wachs said. "In that case, a robotic scrub nurse could be better."

The Purdue researcher has developed a prototype robotic scrub nurse, in work with faculty in the university's School of Veterinary Medicine.

Researchers at other institutions developing robotic scrub nurses have focused on voice recognition. However, little work has been done in the area of gesture recognition, Wachs said.

"Another big difference between our focus and the others is that we are also working on prediction, to anticipate what images the surgeon will need to see next and what instruments will be needed," he said.

Wachs is developing advanced algorithms that isolate the hands and apply "anthropometry," or predicting the position of the hands based on knowledge of where the surgeon's head is. The tracking is achieved through a camera mounted over the screen used for visualization of images.

"Another contribution is that by tracking a surgical instrument inside the patient's body, we can predict the most likely area that the surgeon may want to inspect using the electronic image medical record, and therefore saving browsing time between the images," Wachs said. "This is done using a different sensor mounted over the surgical lights."

The hand-gesture recognition system uses a new type of camera developed by Microsoft, called Kinect, which senses three-dimensional space. The camera is used in new consumer gaming systems that can track a person's hands without the use of a wand.

"You just step into the operating room, and automatically your body is mapped in 3-D," he said.

Accuracy and gesture-recognition speed depend on advanced software algorithms.

"Even if you have the best camera, you have to know how to program the camera, how to use the images," Wachs said. "Otherwise, the system will work very slowly."

The research paper defines a set of requirements, including recommendations that the system should:

* Use a small vocabulary of simple, easily recognizable gestures.
* Not require the user to wear special virtual reality gloves or certain types of clothing.
* Be as low-cost as possible.
* Be responsive and able to keep up with the speed of a surgeon's hand gestures.
* Let the user know whether it understands the hand gestures by providing feedback, perhaps just a simple "OK."
* Use gestures that are easy for surgeons to learn, remember and carry out with little physical exertion.
* Be highly accurate in recognizing hand gestures.
* Use intuitive gestures, such as two fingers held apart to mimic a pair of scissors.
* Be able to disregard unintended gestures by the surgeon, perhaps made in conversation with colleagues in the operating room.
* Be able to quickly configure itself to work properly in different operating rooms, under various lighting conditions and other criteria.
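As a toy illustration of those requirements, the Python sketch below matches a 2-D hand trajectory against a tiny gesture vocabulary, rejects poor matches as unintended gestures, and echoes a simple "OK"-style confirmation. The gesture names, templates and threshold are all hypothetical stand-ins, not the Purdue system.

```python
import math

# Hypothetical gesture templates: each gesture is a short 2-D hand
# trajectory (a sequence of (x, y) points), resampled to a fixed length.
TEMPLATES = {
    "next_image": [(0.0, 0.0), (0.5, 0.0), (1.0, 0.0)],   # swipe right
    "prev_image": [(1.0, 0.0), (0.5, 0.0), (0.0, 0.0)],   # swipe left
    "zoom_in":    [(0.5, 0.0), (0.5, 0.5), (0.5, 1.0)],   # move up
}

def trajectory_distance(a, b):
    """Mean Euclidean distance between two equal-length trajectories."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def classify(trajectory, threshold=0.3):
    """Match a trajectory to the closest template; reject anything that
    fits no template well, a stand-in for ignoring conversational gestures."""
    best, best_d = None, float("inf")
    for name, template in TEMPLATES.items():
        d = trajectory_distance(trajectory, template)
        if d < best_d:
            best, best_d = name, d
    if best_d > threshold:
        return None  # probably an unintended gesture: do nothing
    return best

observed = [(0.05, 0.02), (0.48, -0.03), (0.97, 0.01)]  # noisy swipe right
command = classify(observed)
print("OK:", command)  # simple feedback so the user knows it was understood
```

Feeding in real tracking data would mean resampling each observed trajectory to the template length before matching.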

"Eventually we also want to integrate voice recognition, but the biggest challenges are in gesture recognition," Wachs said. "Much is already known about voice recognition."

Image: Robotic nurse. Credit: Purdue University photo/Mark Simons

Source: Reprinted news release via Purdue University

Friday, February 04, 2011

Robonaut 2 To Make Television Debut on Super Bowl Sunday

Robonaut 2. Credit: NASA
Robonaut 2, NASA's dexterous humanoid robot, will make its television debut on Super Bowl Sunday, Feb. 6, 2011. Millions of viewers will be able to watch the state-of-the-art robot during a General Motors segment to air during the Super Bowl pre-game show on the Fox network.

Robonaut 2, or R2, was developed and built by NASA and General Motors via a Space Act Agreement. Built with the latest technology, it is a new humanoid robot capable of working side by side with people. Using leading-edge control, sensor and vision technologies, future R2s could assist astronauts during hazardous space missions and help GM build safer cars and plants.

The two organizations, with the help of engineers from Oceaneering Space Systems of Houston, developed and built the current iteration of Robonaut. R2 is a faster, more dexterous and more technologically advanced robot. Its capabilities include the use of fully functional hands and arms to do work beyond the scope of prior humanoid machines.

Like its predecessor Robonaut 1, R2 is capable of handling a wide range of tools and interfaces, but it represents a significant advancement. R2 is capable of speeds more than four times faster than R1, is more compact, is more dexterous, and includes a deeper and wider range of sensing.

Advanced technology spans the entire R2 system and includes: optimized overlapping dual arm dexterous workspace, series elastic joint technology, extended finger and thumb travel, miniaturized 6-axis load cells, redundant force sensing, ultra-high speed joint controllers, extreme neck travel, and high resolution camera and IR systems. The dexterity of R2 allows it to use the same tools that astronauts use and removes the need for specialized tools just for robots.

One advantage of a humanoid design is that Robonaut can take over simple, repetitive, or especially dangerous tasks in places such as the International Space Station.

Source: Reprinted news release via NASA

New Approach To Solar Cells

An interdisciplinary team of UC Davis and UC Santa Cruz researchers is taking a novel approach to solar power, one that promises to lead to a technological breakthrough. By using nanoparticles of germanium, silicon and other materials, the researchers hope to produce solar cells far more efficient than the current state of the art.

The project was recently awarded $1.5 million over three years from the National Science Foundation.

Conventional solar cells all operate on the same principle of "one photon in, one electron out," said Gergely Zimanyi, professor of physics at UC Davis and principal investigator on the NSF grant. In other words, one particle of light, or photon, hits the solar cell and generates one electron to produce an electrical current.

The efficiency — energy out compared to energy in — of a solar cell operating according to this principle is capped by a theoretical maximum of 31 percent. But by constructing solar cells from extremely small nanoparticles, the UC researchers aim to generate several electrons for each photon, raising the maximum efficiency to between 42 and 65 percent.
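The gap between the two paradigms can be sketched with a toy energy-conservation bound: a single photon can create at most roughly floor(E_photon / E_gap) electron-hole pairs. The germanium bandgap and the photon energies below are illustrative textbook numbers, not figures from the UC project.

```python
# Illustrative upper bound on carrier multiplication: a photon of energy E
# cannot yield more electron-hole pairs than E divided by the bandgap.
def max_electrons_per_photon(photon_energy_ev, bandgap_ev):
    return max(0, int(photon_energy_ev // bandgap_ev))

E_GAP_GE = 0.67  # bulk germanium bandgap in eV; nanoparticle gaps differ

for e_photon in (0.5, 1.0, 2.0, 3.0):
    n = max_electrons_per_photon(e_photon, E_GAP_GE)
    print(f"{e_photon:.1f} eV photon -> up to {n} electron(s)")
```

In a conventional cell the answer is always at most one; the multiple-exciton approach tries to approach this energy-conservation ceiling for high-energy photons.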

The one-photon-in/multiple-electrons-out paradigm has been demonstrated at the Los Alamos National Laboratory, Zimanyi said — but the Los Alamos group did not build a functioning solar cell based on this paradigm. The UC Davis/UC Santa Cruz team includes scientists with experience making solar cells from nanoparticles, giving hope that the group will be able to construct a fully functioning and well-optimized solar cell from germanium and silicon nanoparticles, he said.

The team members are: Zimanyi; UC Davis chemistry professors Susan Kauzlarich and Delmar Larsen; Professor Giulia Galli, who holds a joint appointment in physics and chemistry; Professor Zhaojun Bai, Department of Mathematics and Computer Science; Debashis Paul, professor in the Department of Statistics; and Susan Carter, professor of physics at UC Santa Cruz.

The interdisciplinary nature of the team was crucial to getting the proposal funded, Zimanyi said. "NSF asked for a collaborative effort between materials sciences, chemistry and mathematical sciences," he said.

Zimanyi, Galli and Bai will conduct theoretical and computer-modeling studies, with Paul providing statistical expertise; Kauzlarich's lab will synthesize the new nanoparticles, Larsen's group will characterize them and Carter's lab at UCSC will develop a working device. A prototype cell had already been constructed before the grant was awarded and exhibited an efficiency of about 8 percent, which Zimanyi described as a very encouraging result given the limited resources going into its construction.

The team will collaborate with the California Solar Energy Collaborative, which is based at UC Davis and led by Pieter Stroeve, professor of chemical engineering and materials science. The team also plans an outreach effort, primarily via its public webpage: http://www.solarwiki.ucdavis.edu/.

Source: Reprinted news release via University of California - Davis

Thursday, January 27, 2011

China Builds 15-Story Hotel In 6 Days [Video]

Here is a video of a hotel being built in just 6 days! The Chinese really have construction down to a science!

If you think this video is cool, check these out:
The Big Bang In A Light Bulb
How To Make A Star Trek Style Door
The Coolest Homemade Lightsaber
Cereal Boxes That Light Up




Via Singularityhub

Friday, January 14, 2011

Yikes! It's a YikeBike, The Future Of Personal Transportation

Oh my future! It's a YikeBike, a futuristic-looking electric bike designed to get you from point A to point B. Did I mention that it is foldable?

With a 6 mile range, a 15mph top speed, and a $3,500 price tag, it ain't cheap, it ain't fast, and it doesn't go far, but it's still the coolest looking bike on the block!

Now, this isn't a bike to just jump on and ride into the sunset; it has a bit of a learning curve and takes some practice before trying it out on the streets. The last thing you want to do is pay $3,500 just to crash the thing head on into an innocent pedestrian. Yikes, I wonder if that is why they call it the "YikeBike"? Who knows. The only other thing I have to say is that if you like unicycles, then the YikeBike is like a super futuristic unicycle sure to get you from point A to point B. Will it replace the Segway? It's possible; at the least it could offer up some competition.

Here is a promo video for the YikeBike. For more information visit yikebike.com

Thursday, January 13, 2011

Stanford researcher uses living cells to create 'biotic' video games (w/ Video)

(PhysOrg.com) -- The digital revolution has triggered a wild proliferation of video games, but what of the revolution in biotechnology? Does it have the potential to spawn its own brood of games? Stanford physicist Ingmar Riedel-Kruse has begun developing "biotic games" involving paramecia and other living organisms. He hopes the games lead to advances in education and crowd-sourcing of laboratory research while helping to raise the level of public discourse on bio-related issues.

Read more -- Stanford researcher uses living cells to create 'biotic' video games (w/ Video)

Wednesday, January 12, 2011

The 'Spaser' Heats Up Laser Technology

Tel Aviv University develops a groundbreaking nano-laser for medicine and electronics

Lasers have revolutionized the communications and medical industries. They focus light to zap tumors and send digital TV signals and telephone communications around the world.

But the physical length of an ordinary laser cannot be less than one half of the wavelength of its light, which limits its application in many industries. Now the Spaser, a new invention developed in part by Tel Aviv University, can be as small as needed to fuel nano-technologies of the future.

Prof. David Bergman of Tel Aviv University's Department of Physics and Astronomy developed and patented the theory behind the Spaser device in 2003 with Prof. Mark Stockman of Georgia State University in Atlanta. It is now being developed into a practical tool by research teams in the United States and around the world.

"Spaser" is an acronym for "surface plasmon amplification by stimulated emission of radiation" ― and despite its mouth-filling definition, it's a number one buzzword in the nanotechnologies industry. The Spaser has been presented at recent meetings and symposia around the world, including a recent European Optical Society Annual Meeting.

Seeing your DNA up close

Spasers are considered a critical component for future technologies based on nanophotonics – technologies that could lead to radical innovations in medicine and science, such as a sensor and microscope 10 times more powerful than anything used today. A Spaser-based microscope might be so sensitive that it could see genetic base pairs in DNA.

It could also lead to computers and electronics that operate at speeds 100 times greater than today's devices, using light instead of electrons to communicate and compute. More efficient solar energy collectors in renewable energy are another proposed application.

"It rhymes with laser, but our Spaser is different," says Prof. Bergman, who owns the Spaser patent with his American partner. "Based on pure physics, it's like a laser, but much, much, much smaller." The Spaser uses surface plasma waves, whose wavelength can be much smaller than that of the light it produces. That's why a Spaser can be less than 100 nanometers, or one-tenth of a micron, long. This is much less than the wavelength of visible light, explains Prof. Bergman.
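The size comparison is easy to spot-check: as noted earlier, a conventional laser can be no shorter than half its wavelength, while the Spaser's quoted upper bound is about 100 nanometers. A two-line sketch:

```python
# Minimum length of a conventional laser: half the wavelength of its light.
def min_laser_length_nm(wavelength_nm):
    return wavelength_nm / 2

SPASER_NM = 100  # upper bound quoted for the Spaser, in nanometers

for wl in (400, 700):  # rough bounds of visible light, in nm
    print(f"{wl} nm light -> minimum conventional laser: "
          f"{min_laser_length_nm(wl):.0f} nm")
```

Even at the short (400 nm) end of the visible spectrum, the conventional minimum is 200 nm, so a sub-100-nm Spaser sits well below the limit.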

Fuelling the buzz

In the next year, the research team expects even more buzz to be created around their invention. In 2009, a team from Norfolk State University, Purdue University, and Cornell University managed to create a practical prototype.

The Spaser will extend the range of what's possible in modern electronics and optical devices, well beyond today's computer chips and memories, Prof. Bergman believes. The physical limitations of current materials are overcome in the Spaser because it uses plasmons, and not photons. With the development of surface plasma waves ― electromagnetic waves combined with an electron fluid wave in a metal ― future nano-devices will operate photonic circuitry on the surface of a metal. But a source of those waves will be needed. That's where the Spaser comes in.

Smaller than the wavelength of light, nano-sized plasmonic devices will be fast and small. Currently the research team is working on commercializing their invention, which they suggest could represent a quantum leap in the development of nano-sized devices.

Source: Reprinted news release via American Friends of Tel Aviv University

Friday, January 07, 2011

Cereal Boxes That Light-Up [Video]

At CES 2011, I noticed these awesome cereal boxes infused with printable electronics that allow them to light up. As the awesomeness started to settle in, I realized that out of all the technologies and gadgets featured at the show, these cereal boxes were my favorite. I absolutely refuse to buy any more cereal until these hit the store shelves!



Via Neowin

Thursday, January 06, 2011

NASA Tests New Propulsion System For Robotic Lander Prototype

The robotic lander prototype's propulsion system, shown during a hot-fire test. Credit: Dynetics Corp.
NASA's Robotic Lunar Lander Development Project at Marshall Space Flight Center in Huntsville, Ala., has completed a series of hot fire tests and taken delivery of a new propulsion system for integration into a more sophisticated free-flying autonomous robotic lander prototype. The project is partnered with the Johns Hopkins University Applied Physics Laboratory in Laurel, Md., to develop a new generation of small, smart, versatile robotic landers to achieve scientific and exploration goals on the surface of the moon and near-Earth asteroids.

The new robotic lander prototype will continue to mature the robotic lander capability by bringing online an autonomous flying test lander capable of flying for up to sixty seconds, testing the guidance, navigation and control system by demonstrating a controlled landing in a simulated low-gravity environment.

By the spring of 2011, the new prototype lander will begin flight tests at the U.S. Army's Redstone Arsenal Test Center in Huntsville, Ala.

The prototype's new propulsion system consists of 12 small attitude control thrusters, three primary descent thrusters to control the vehicle's altitude, and one large "gravity-canceling" thruster, which offsets a portion of the prototype's weight to simulate a lower-gravity environment like that of the moon and asteroids. The prototype uses a green propellant, hydrogen peroxide, at a stronger concentration than the solution commonly used in homes as a disinfectant. The by-products after use are water and oxygen.
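As a back-of-the-envelope sketch of the gravity-canceling idea: for the prototype to fall as if in lunar gravity, the large thruster must continuously offset the difference between its Earth weight and its simulated lunar weight. The prototype mass below is a made-up placeholder, not a NASA figure.

```python
G_EARTH = 9.81  # surface gravity of Earth, m/s^2
G_MOON = 1.62   # surface gravity of the moon, m/s^2

def canceling_thrust(mass_kg, g_target=G_MOON):
    """Thrust (N) needed so the net downward acceleration equals g_target."""
    return mass_kg * (G_EARTH - g_target)

mass = 300.0  # kg -- hypothetical prototype mass, for illustration only
print(f"{canceling_thrust(mass):.0f} N")  # 300 * 8.19 = 2457 N
```

Simulating an asteroid's even weaker gravity would push g_target toward zero, so the thruster would have to carry nearly the full Earth weight.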

"The propulsion hardware acceptance test consisted of a series of tests that verified the performance of each thruster in the propulsion system," said Julie Bassler, Robotic Lunar Lander Development Project Manager. "The series culminated in a test that characterized the entire system by running a scripted set of thruster firings based on a flight scenario simulation."

The propulsion system is currently at Teledyne Brown's manufacturing facility in Huntsville, Ala., for integration with the structure and avionics to complete the new robotic lander prototype. Dynetics Corp. developed the robotic lander prototype propulsion system under the management of the Von Braun Center for Science and Innovation, both located in Huntsville, Ala.

"This is the second phase of a robotic lander prototype development program," said Bassler. "Our initial 'cold gas' prototype was built, delivered and successfully flight tested at the Marshall Center in a record nine months, providing a physical and tangible demonstration of capabilities related to the critical terminal descent and landing phases for an airless body mission."

The first robotic lander prototype achieved a record flight time of ten seconds, descending from an altitude of three meters. It began flight tests in September 2009 and has completed 142 flight tests, providing a platform to develop and test algorithms, sensors, avionics, ground and flight software and ground systems to support autonomous landings on airless bodies, where aero-braking and parachutes are not options.

For more photos of the hardware visit:
http://www.nasa.gov/roboticlander


For more information about NASA visit:
http://www.nasa.gov

CES 2011 Live Streaming Video Via Crunchgear

The Consumer Electronics Show (CES) 2011 has taken over Las Vegas! Can't make it? No problemo! CES 2011 is live streaming via Crunchgear. The video is below. Don't miss a thing!

Wednesday, January 05, 2011

Newly Developed Cloak Hides Underwater Objects From Sonar

Illinois researchers designed a two-dimensional cylindrical cloak made of 16 concentric rings of acoustic circuits structured to guide sound waves. Each ring has a different index of refraction, meaning that sound waves vary their speed from the outer rings to the inner ones. Credit: L. Brian Stauffer
In one University of Illinois lab, invisibility is a matter of now you hear it, now you don't.


Led by mechanical science and engineering professor Nicholas Fang, Illinois researchers have demonstrated an acoustic cloak, a technology that renders underwater objects invisible to sonar and other ultrasound waves.

"We are not talking about science fiction. We are talking about controlling sound waves by bending and twisting them in a designer space," said Fang, who also is affiliated with the Beckman Institute for Advanced Science and Technology. "This is certainly not some trick Harry Potter is playing with."

While materials that can wrap sound around an object rather than reflecting or absorbing it have been theoretically possible for a few years, realization of the concept has been a challenge. In a paper accepted for publication in the journal Physical Review Letters, Fang's team describes a working prototype capable of hiding an object from a broad range of sound waves.

The cloak is made of metamaterial, a class of artificial materials that have enhanced properties as a result of their carefully engineered structure. Fang's team designed a two-dimensional cylindrical cloak made of 16 concentric rings of acoustic circuits structured to guide sound waves. Each ring has a different index of refraction, meaning that sound waves vary their speed from the outer rings to the inner ones.

"Basically what you are looking at is an array of cavities that are connected by channels. The sound is going to propagate inside those channels, and the cavities are designed to slow the waves down," Fang said. "As you go further inside the rings, sound waves gain faster and faster speed."

Since speeding up requires energy, the sound waves instead propagate around the cloak's outer rings, guided by the channels in the circuits. The specially structured acoustic circuits actually bend the sound waves to wrap them around the outer layers of the cloak.
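The graded-index idea described above can be sketched numerically. The 16-ring count matches the article, but the index values below are illustrative assumptions, chosen only to show the trend Fang describes: the index of refraction falls toward the inner rings, so the local sound speed rises and waves stay in the slow outer rings.

```python
# Illustrative graded-index profile for a 16-ring acoustic cloak.
# Ring count from the article; index values are assumptions for illustration.
C_WATER = 1482.0  # approximate speed of sound in water, m/s
N_RINGS = 16

def ring_speeds(n_outer=1.0, n_inner=0.5):
    """Return the local sound speed in each ring, outermost first."""
    speeds = []
    for i in range(N_RINGS):
        # Index falls linearly from the outer to the inner ring, so the
        # local wave speed c = C_WATER / n rises toward the centre.
        n = n_outer + (n_inner - n_outer) * i / (N_RINGS - 1)
        speeds.append(C_WATER / n)
    return speeds

speeds = ring_speeds()
# Speeding up costs energy, so waves are guided around the slow outer rings.
print(f"outer ring: {speeds[0]:.0f} m/s, inner ring: {speeds[-1]:.0f} m/s")
```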

The researchers tested their cloak's ability to hide a steel cylinder. They submerged the cylinder in a tank with an ultrasound source on one side and a sensor array on the other, then placed the cylinder inside the cloak and watched it disappear from their sonar.

Curious to see if the hidden object's structure played a role in the cloaking phenomenon, the researchers conducted trials with other objects of various shapes and densities.

"The structure of what you're trying to hide doesn't matter," Fang said. "The effect is similar. After we placed the cloaked structure around the object we wanted to hide, the scattering or shadow effect was greatly reduced."

An advantage of the acoustic cloak is its ability to cover a broad range of sound wavelengths. The cloak offers acoustic invisibility to ultrasound waves from 40 to 80 kHz, although with modification it could theoretically be tuned to cover tens of megahertz.
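For a sense of scale, the 40–80 kHz band corresponds to underwater wavelengths of roughly 19 to 37 mm. A quick sketch, assuming a nominal sound speed in water of 1482 m/s (the article gives only the frequency range):

```python
# Wavelengths covered by the cloak's 40-80 kHz band in water.
C_WATER = 1482.0  # assumed speed of sound in water, m/s

def wavelength_mm(freq_hz):
    """Acoustic wavelength in millimetres at a given frequency."""
    return C_WATER / freq_hz * 1000.0

for f_khz in (40, 80):
    print(f"{f_khz} kHz -> {wavelength_mm(f_khz * 1000):.1f} mm")
```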

"This is not just a single wavelength effect. You don't have an invisible cloak that's showing up just by switching the frequencies slightly," Fang said. "The geometry is not theoretically scaled with wavelengths. The nice thing about the circuit element approach is that you can scale the channels down while maintaining the same wave propagation technology."

Next, the researchers plan to explore how the cloaking technology could influence applications from military stealth to soundproofing to health care. For example, ultrasound and other acoustic imaging techniques are common in medical practice, but many things in the body can cause interference and mar the image. A metamaterial bandage or shield could effectively hide a troublesome area so the scanner could focus on the region of interest.

The cloaking technology also may affect nonlinear acoustic phenomena. One problem plaguing fast-moving underwater objects is cavitation, or the formation and implosion of bubbles. Fang and his group believe that they could harness their cloak's abilities to balance energy in cavitation-causing areas, such as the vortex around a propeller.

Source: Reprinted news release via University of Illinois at Urbana-Champaign

Tuesday, January 04, 2011

New Solar Cell Self-Repairs Like Natural Plant Systems

Researchers are creating a new type of solar cell designed to self-repair like natural photosynthetic systems in plants by using carbon nanotubes and DNA, an approach aimed at increasing service life and reducing cost.

"We've created artificial photosystems using optical nanomaterials to harvest solar energy that is converted to electrical power," said Jong Hyun Choi, an assistant professor of mechanical engineering at Purdue University.

The design exploits the unusual electrical properties of structures called single-wall carbon nanotubes, using them as "molecular wires in light harvesting cells," said Choi, whose research group is based at the Birck Nanotechnology and Bindley Bioscience centers at Purdue's Discovery Park.

"I think our approach offers promise for industrialization, but we're still in the basic research stage," he said.

Photoelectrochemical cells convert sunlight into electricity and use an electrolyte - a liquid that conducts electricity - to transport electrons and create the current. The cells contain light-absorbing dyes called chromophores, chlorophyll-like molecules that degrade due to exposure to sunlight.

"The critical disadvantage of conventional photoelectrochemical cells is this degradation," Choi said.

The new technology overcomes this problem just as nature does: by continuously replacing the photo-damaged dyes with new ones.

"This sort of self-regeneration is done in plants every hour," Choi said.

The new concept could make possible an innovative type of photoelectrochemical cell that continues operating at full capacity indefinitely, as long as new chromophores are added.

Findings were detailed in a November presentation during the International Mechanical Engineering Congress and Exhibition in Vancouver. The concept also was unveiled in an online article (http://spie.org/x41475.xml?ArticleID=x41475) featured on the Web site for SPIE, an international society for optics and photonics.

The talk and article were written by Choi, doctoral students Benjamin A. Baker and Tae-Gon Cha, and undergraduate students M. Dane Sauffer and Yujun Wu.

The carbon nanotubes work as a platform to anchor strands of DNA. The DNA is engineered to have specific sequences of building blocks called nucleotides, enabling them to recognize and attach to the chromophores.

"The DNA recognizes the dye molecules, and then the system spontaneously self-assembles," Choi said.

When the chromophores are ready to be replaced, they might be removed by using chemical processes or by adding new DNA strands with different nucleotide sequences, kicking off the damaged dye molecules. New chromophores would then be added.

Two elements are critical for the technology to mimic nature's self-repair mechanism: molecular recognition and thermodynamic metastability, or the ability of the system to continuously be dissolved and reassembled.

The research is an extension of work that Choi collaborated on with researchers at the Massachusetts Institute of Technology and the University of Illinois. The earlier work used biological chromophores taken from bacteria, and findings were detailed in a research paper published in November in the journal Nature Chemistry (http://www.nature.com/nchem/journal/v2/n11/abs/nchem.822.html).

However, using natural chromophores is difficult, and they must be harvested and isolated from bacteria, a process that would be expensive to reproduce on an industrial scale, Choi said.

"So instead of using biological chromophores, we want to use synthetic ones made of dyes called porphyrins," he said.

Image: Jong Hyun Choi, an assistant professor of mechanical engineering at Purdue, and doctoral student Benjamin Baker use fluorescent imaging to view a carbon nanotube. Their research is aimed at creating a new type of solar cell designed to self-repair like natural photosynthetic systems. The approach might enable researchers to increase the service life and reduce costs for photoelectrochemical cells, which convert sunlight into electricity.
Credit: Purdue University photo/Mark Simons

Source: Reprinted news release via Purdue University

Saturday, December 18, 2010

NASA Test Fires AJ26 Engine For The Taurus II Space Vehicle

Test firing of the engine on Dec. 17, 2010.
Image Credit: NASA
How long is a minute? It is longer than you think when it is filled with fire, steam and noise – lots of noise.

On Dec. 17, at NASA's John C. Stennis Space Center, a team of operators from Stennis, Orbital Sciences Corporation and Aerojet filled 55 seconds with all three during the second verification test fire of an Aerojet AJ26 rocket engine. Once verified, the engine will be placed on a Taurus II space vehicle and used to launch a cargo supply mission to the International Space Station.

It is all part of NASA’s effort to partner with commercial companies to provide space flights through the Commercial Orbital Transportation Services joint research and development project. Through that program, Orbital has agreed to provide eight cargo supply missions to the space station by 2015. Stennis has partnered with Orbital to test the engines that will power the missions.

So, when Orbital’s Taurus II space vehicle lifts off, it will do so on engines proven flight worthy at Stennis. That is a big responsibility, but it is one which engine test personnel at Stennis are used to filling. They tested engines for every manned Apollo space flight and all of the engines used on more than 130 space shuttle missions.

Source: Reprinted news release via NASA

Friday, December 17, 2010

First Measurement Of Magnetic Field In Earth's Core

A University of California, Berkeley, geophysicist has made the first-ever measurement of the strength of the magnetic field inside Earth's core, 1,800 miles underground.

The magnetic field strength is 25 Gauss, or 50 times stronger than the magnetic field at the surface that makes compass needles align north-south. Though this number is in the middle of the range geophysicists predict, it puts constraints on the identity of the heat sources in the core that keep the internal dynamo running to maintain this magnetic field.
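The numbers above convert straightforwardly to SI units. The 25-gauss core value and the factor of 50 come from the article; the conversion (1 gauss = 10⁻⁴ tesla) is standard.

```python
# Putting the measured core field in context.
CORE_FIELD_GAUSS = 25.0   # Buffett's measured interior field
RATIO_TO_SURFACE = 50.0   # core field is 50x the surface field

surface_gauss = CORE_FIELD_GAUSS / RATIO_TO_SURFACE  # ~0.5 gauss at the surface
core_tesla = CORE_FIELD_GAUSS * 1e-4                 # 1 gauss = 1e-4 tesla

print(f"surface field ~{surface_gauss} gauss, core field {core_tesla * 1000} mT")
```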

"This is the first really good number we've had based on observations, not inference," said author Bruce A. Buffett, professor of earth and planetary science at UC Berkeley. "The result is not controversial, but it does rule out a very weak magnetic field and argues against a very strong field."

The results are published in the Dec. 16 issue of the journal Nature.

A strong magnetic field inside the outer core means there is a lot of convection and thus a lot of heat being produced, which scientists would need to account for, Buffett said. The presumed sources of energy are the residual heat from 4 billion years ago when the planet was hot and molten, release of gravitational energy as heavy elements sink to the bottom of the liquid core, and radioactive decay of long-lived elements such as potassium, uranium and thorium.

A weak field – 5 Gauss, for example – would imply that little heat is being supplied by radioactive decay, while a strong field, on the order of 100 Gauss, would imply a large contribution from radioactive decay.

"A measurement of the magnetic field tells us what the energy requirements are and what the sources of heat are," Buffett said.

About 60 percent of the power generated inside the earth likely comes from the exclusion of light elements from the solid inner core as it freezes and grows, he said. This constantly builds up crud in the outer core.

The Earth's magnetic field is produced in the outer two-thirds of the planet's iron/nickel core. This outer core, about 1,400 miles thick, is liquid, while the inner core is a frozen iron and nickel wrecking ball with a radius of about 800 miles – roughly the size of the moon. The core is surrounded by a hot, gooey mantle and a rigid surface crust.

The cooling Earth originally captured its magnetic field from the planetary disk in which the solar system formed. That field would have disappeared within 10,000 years if not for the planet's internal dynamo, which regenerates the field thanks to heat produced inside the planet. The heat makes the liquid outer core boil, or "convect," and as the conducting metals rise and then sink through the existing magnetic field, they create electrical currents that maintain the magnetic field. This roiling dynamo produces a slowly shifting magnetic field at the surface.

"You get changes in the surface magnetic field that look a lot like gyres and flows in the oceans and the atmosphere, but these are being driven by fluid flow in the outer core," Buffett said.

Buffett is a theoretician who uses observations to improve computer models of the earth's internal dynamo. Now at work on a second generation model, he admits that a lack of information about conditions in the earth's interior has been a big hindrance to making accurate models.

He realized, however, that the tug of the moon on the tilt of the earth's spin axis could provide information about the magnetic field inside. This tug would make the inner core precess – that is, make the spin axis slowly rotate in the opposite direction – which would produce magnetic changes in the outer core that damp the precession. Radio observations of distant quasars – extremely bright, active galaxies – provide very precise measurements of the changes in the earth's rotation axis needed to calculate this damping.

"The moon is continually forcing the rotation axis of the core to precess, and we're looking at the response of the fluid outer core to the precession of the inner core," he said.

By calculating the effect of the moon on the spinning inner core, Buffett discovered that the precession makes the slightly out-of-round inner core generate shear waves in the liquid outer core. These waves of molten iron and nickel move within a tight cone only 30 to 40 meters thick, interacting with the magnetic field to produce an electric current that heats the liquid. This serves to damp the precession of the rotation axis. The damping causes the precession to lag behind the moon as it orbits the earth. A measurement of the lag allowed Buffett to calculate the magnitude of the damping and thus of the magnetic field inside the outer core.

Buffett noted that the calculated field – 25 Gauss – is an average over the entire outer core. The field is expected to vary with position.

"I still find it remarkable that we can look to distant quasars to get insights into the deep interior of our planet," Buffett said.

Source: Reprinted news release via University of California - Berkeley

Saturday, November 20, 2010

New Microscope Reveals Ultrastructure Of Cells

Slice through the nucleus of a mouse adenocarcinoma cell showing the nucleolus (NU) and the membrane channels running across the nucleus (NMC); taken by X-ray nanotomography.
Photo: HZB/Schneider
For the first time, there is no need to chemically fix, stain or cut cells in order to study them. Instead, whole living cells are fast-frozen and studied in their natural environment. The new method delivers an immediate 3-D image, thereby closing a gap between conventional microscopic techniques.

The new microscope delivers a high-resolution 3-D image of the entire cell in one step. This is an advantage over electron microscopy, in which a 3-D image is assembled out of many thin sections, a process that can take up to weeks for just one cell. Also, the cell need not be labelled with dyes, unlike in fluorescence microscopy, where only the labelled structures become visible. The new X-ray microscope instead exploits the natural contrast between organic material and water to form an image of all cell structures. Dr. Gerd Schneider and his microscopy team at the Institute for Soft Matter and Functional Materials have published their development in Nature Methods (DOI:10.1038/nmeth.1533).

With the high resolution achieved by their microscope, the researchers, in cooperation with colleagues of the National Cancer Institute in the USA, have reconstructed mouse adenocarcinoma cells in three dimensions. The smallest of details were visible: the double membrane of the cell nucleus, nuclear pores in the nuclear envelope, membrane channels in the nucleus, numerous invaginations of the inner mitochondrial membrane and inclusions in cell organelles such as lysosomes. Such insights will be crucial for shedding light on inner-cellular processes, such as how viruses or nanoparticles penetrate into cells or into the nucleus.

This is the first time the so-called ultrastructure of cells has been imaged with X-rays to such precision, down to 30 nanometres. Ten nanometres are about one ten-thousandth of the width of a human hair. Ultrastructure is the detailed structure of a biological specimen that is too small to be seen with an optical microscope.
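The scale comparison checks out arithmetically, taking a human hair to be about 100 micrometres wide (a common rough figure; the article does not state it):

```python
# Quick scale check for the hair-width comparison in the text.
HAIR_WIDTH_M = 100e-6  # assumed width of a human hair, ~100 micrometres
TEN_NM = 10e-9
RESOLUTION_M = 30e-9   # the microscope's resolution

ratio_10nm = HAIR_WIDTH_M / TEN_NM        # 10 nm is ~1/10,000 of a hair width
ratio_res = HAIR_WIDTH_M / RESOLUTION_M   # 30 nm is ~1/3,333 of a hair width
print(f"10 nm: 1/{ratio_10nm:.0f} of a hair; 30 nm: 1/{ratio_res:.0f}")
```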

Researchers achieved this high 3-D resolution by illuminating the minute structures of the frozen-hydrated object with partially coherent light. This light is generated by BESSY II, the synchrotron source at HZB. Partial coherence is the property of two waves whose relative phase undergoes random fluctuations which are not, however, sufficient to make the wave completely incoherent. Illumination with partially coherent light generates significantly higher contrast for small object details compared to incoherent illumination. Combining this approach with a high-resolution lens, the researchers were able to visualize the ultrastructures of cells at hitherto unattained contrast.

The new X-ray microscope also allows for more space around the sample, which leads to a better spatial view. This space has always been greatly limited by the setup for the sample illumination. The required monochromatic X-ray light was created using a radial grid and then, from this light, a diaphragm would select the desired range of wavelengths. The diaphragm had to be placed so close to the sample that there was almost no space to turn the sample around. The researchers modified this setup: Monochromatic light is collected by a new type of condenser which directly illuminates the object, and the diaphragm is no longer needed. This allows the sample to be turned by up to 158 degrees and observed in three dimensions. These developments provide a new tool in structural biology for the better understanding of the cell structure.

Source: Reprinted news release via Helmholtz Association of German Research Centres


Thursday, November 18, 2010

Magnetic Trapping Will Help Unlock The Secrets Of Anti-Matter

A clearer understanding of the Universe, its origins and maybe even its destiny is a significant step closer, thanks to new research.

As part of a major international experiment called ALPHA*, based at CERN in Switzerland, researchers have helped to trap and hold atoms of 'anti-hydrogen', something that has not previously been possible.

The project involves physicists at Swansea University led by Professor Mike Charlton, Dr Niels Madsen and Dr Dirk Peter van der Werf and the University of Liverpool under Professor Paul Nolan, all supported by the Engineering and Physical Sciences Research Council (EPSRC).

This breakthrough will make it possible to study 'anti-matter' closely for the first time, and so develop unprecedented insight into its composition/structure and improve understanding of the fundamental physical principles that underpin the Universe and the way it works.

For nearly a decade, scientists have been able to undertake the controlled production of anti-hydrogen atoms in the laboratory – a breakthrough which Swansea University also contributed to, with EPSRC support**. But as anti-matter particles are instantly annihilated when they come into contact with matter, it has not, until now, been feasible to study anti-hydrogen atoms in any detail.

ALPHA has therefore developed techniques that not only cool and slow down the anti-particles that make up anti-hydrogen and gently mix them to produce anti-hydrogen atoms, but also trap some of the anti-atoms for long enough so they can be studied.

The key focus of this effort has been the development of electromagnetic traps that have a number of cold species inside. These traps don't just provide the conditions needed to cool the anti-particles prior to mixing. The cold anti-atoms formed also have a tiny 'magnetic moment'*** which means they respond to magnetic fields. By arranging the magnet coils in the right way, it is possible to set up a magnetic 'well' in the centre of the anti-particle mixing zone where anti-hydrogen has been trapped.

"Every type of particle has its anti-matter equivalent which is its mirror image in terms of having, for instance, the opposite electrical charge" says Professor Charlton. "Because hydrogen is the simplest of all atoms, anti-hydrogen is the easiest type of anti-matter to produce in the laboratory. By studying it for the first time, we will be able to understand its properties and establish whether it really is the exact mirror image of hydrogen.

"That understanding will hopefully enable us to shed light on exactly why almost everything in the known Universe consists of matter, rather than anti-matter, and what the implications are in terms of the fundamental way that the Universe functions."

To detect the anti-hydrogen atoms, the researchers released them from the trap. The silicon detector used to determine the positions of the resulting annihilations was developed and built at Liverpool. Professor Nolan comments that "the unique clean room and workshop facilities in Liverpool, together with detector and electronics expertise, allowed us to build this complex and unique instrument that is now part of the ALPHA experiment."

Dr Niels Madsen notes: "Trapping of anti-hydrogen is a major breakthrough in antimatter physics. Having the anti-atoms trapped will allow for comparisons of matter and anti-matter to a level that until now would have been considered wishful thinking."

The initiative is expected to run for several years, with ALPHA commencing tests on anti-hydrogen atoms in around five years' time.

Source: Reprinted news release via Engineering and Physical Sciences Research Council


Researchers Uncover Surprise Link Between Weird Quantum Phenomena

Researchers have uncovered a fundamental link between the two defining properties of quantum physics. Stephanie Wehner of Singapore's Centre for Quantum Technologies and the National University of Singapore and Jonathan Oppenheim of the United Kingdom's University of Cambridge published their work today in the latest edition of the journal Science.

The result is being heralded as a dramatic breakthrough in our basic understanding of quantum mechanics and provides new clues to researchers seeking to understand the foundations of quantum theory. The result addresses the question of why quantum behaviour is as weird as it is—but no weirder.

The strange behaviour of quantum particles, such as atoms, electrons and the photons that make up light, has perplexed scientists for nearly a century. Albert Einstein was among those who thought the quantum world was so strange that quantum theory must be wrong, but experiments have borne out the theory's predictions.

One of the weird aspects of quantum theory is that it is impossible to know certain things, such as a particle's momentum and position, simultaneously. Knowledge of one of these properties affects the accuracy with which you can learn the other. This is known as the "Heisenberg Uncertainty Principle".

Another weird aspect is the quantum phenomenon of non-locality, which arises from the better-known phenomenon of entanglement. When two quantum particles are entangled, they can perform actions that look as if they are coordinated with each other in ways that defy classical intuition about physically separated particles.

Previously, researchers have treated non-locality and uncertainty as two separate phenomena. Now Wehner and Oppenheim have shown that they are intricately linked. What's more, they show that this link is quantitative and have found an equation which shows that the "amount" of non-locality is determined by the uncertainty principle.

"It's a surprising and perhaps ironic twist," said Oppenheim, a Royal Society University Research Fellow from the Department of Applied Mathematics & Theoretical Physics at the University of Cambridge. Einstein and his co-workers discovered non-locality while searching for a way to undermine the uncertainty principle. "Now the uncertainty principle appears to be biting back."

Non-locality determines how well two distant parties can coordinate their actions without sending each other information. Physicists believe that even in quantum mechanics, information cannot travel faster than light. Nevertheless, it turns out that quantum mechanics allows two parties to coordinate much better than would be possible under the laws of classical physics. In fact, their actions can be coordinated in a way that almost seems as if they had been able to talk. Einstein famously referred to this phenomenon as "spooky action at a distance".

However, quantum non-locality could be even spookier than it actually is. It's possible to have theories which allow distant parties to coordinate their actions much better than nature allows, while still not allowing information to travel faster than light. Nature could be weirder, and yet it isn't – quantum theory appears to impose an additional limit on the weirdness.

"Quantum theory is pretty weird, but it isn't as weird as it could be. We really have to ask ourselves, why is quantum mechanics this limited? Why doesn't nature allow even stronger non-locality?" Oppenheim says.

The surprising result by Wehner and Oppenheim is that the uncertainty principle provides an answer. Two parties can only coordinate their actions better if they break the uncertainty principle, which imposes a strict bound on how strong non-locality can be.

"It would be great if we could better coordinate our actions over long distances, as it would enable us to solve many information processing tasks very efficiently," Wehner says. "However, physics would be fundamentally different. If we break the uncertainty principle, there is really no telling what our world would look like."

How did the researchers discover a connection that had gone unnoticed so long? Before entering academia, Wehner worked as a 'computer hacker for hire', and now works in quantum information theory, while Oppenheim is a physicist. Wehner thinks that applying techniques from computer science to the laws of theoretical physics was key to spotting the connection. "I think one of the crucial ideas is to link the question to a coding problem," Wehner says. "Traditional ways of viewing non-locality and uncertainty obscured the close connection between the two concepts."

Wehner and Oppenheim recast the phenomena of quantum physics in terms that would be familiar to a computer hacker. They treat non-locality as the result of one party, Alice, creating and encoding information and a second party, Bob, retrieving information from the encoding. How well Alice and Bob can encode and retrieve information is determined by uncertainty relations. In some situations, they found that a third property known as "steering" enters the picture.

Wehner and Oppenheim compare their discovery to uncovering what determines how easily two players can win a quantum board game: the board has only two squares, on which Alice can place a counter in one of two colours, green or pink. She is told to place the same colour on both squares, or to place a different colour on each. Bob has to guess the colour that Alice put on square one or two. If his guess is correct, Alice and Bob win the game. Clearly, Alice and Bob could win the game if they could talk to each other: Alice would simply tell Bob what colours are on squares one and two. But Bob and Alice are situated so far apart from each other that light – and thus an information-carrying signal – does not have time to pass between them during the game.

If they can't talk, they won't always win, but by measuring on quantum particles, they can win the game more often than any strategy which doesn't rely on quantum theory. However, the uncertainty principle prevents them from doing any better, and even determines how often they lose the game.
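The board game described above is a popular framing of the well-known CHSH test. As an illustration (the specific numbers do not appear in the article, but are standard results for this game), the best classical strategy wins 75% of the time, while the best quantum strategy reaches cos²(π/8) ≈ 85.4%, the so-called Tsirelson bound:

```python
import math

# Winning probabilities for the CHSH-style game sketched in the article.
# Both bounds are standard results, not figures from the article itself.
classical_best = 3 / 4                     # best strategy without quantum particles
quantum_best = math.cos(math.pi / 8) ** 2  # Tsirelson's bound, ~0.8536

print(f"best classical strategy wins {classical_best:.1%} of games")
print(f"best quantum strategy wins  {quantum_best:.1%} of games")
```

The gap between these two numbers is exactly the "amount" of non-locality that, per Wehner and Oppenheim, the uncertainty principle keeps in check.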

The finding bears on the deep question of what principles underlie quantum physics. Many attempts to understand the underpinnings of quantum mechanics have focused on non-locality. Wehner thinks there may be more to gain from examining the details of the uncertainty principle. "However, we have barely scratched the surface of understanding uncertainty relations," she says.

The breakthrough is future-proof, the researchers say. Scientists are still searching for a quantum theory of gravity and Wehner and Oppenheim's result concerning non-locality, uncertainty and steering applies to all possible theories – including any future replacement for quantum mechanics.

Source: Reprinted news release via Centre for Quantum Technologies at the National University of Singapore


Wednesday, November 17, 2010

International Research Team Trap 38 Antimatter Atoms

In the movie Angels and Demons, scientists have solved one of the most perplexing scientific problems: the capture and storage of antimatter. In real life, trapping atomic antimatter has never been accomplished, until now.

A team made up of researchers from the University of Calgary and institutions across Canada and around the world has discovered how to trap atomic antimatter, and the results of their discovery are published in the journal Nature.

“This is a major discovery. It could enable experiments that result in dramatic changes to the current view of fundamental physics or in confirmation of what we already know now,” says Dr. Rob Thompson, head of physics and astronomy at the University of Calgary and co-investigator in the ALPHA collaboration, one of two teams competing to gain a better understanding of antimatter and our universe.

Both teams, ALPHA and the Harvard-led ATRAP, have been at this race for over five years, conducting experiments in close quarters at CERN (European Organization for Nuclear Research), the world's largest particle physics lab, located in the suburbs of Geneva, Switzerland. CERN is the only laboratory in the world with the proper equipment where this research can be carried out.

“These are significant steps in antimatter research,” said CERN Director General Dr. Rolf Heuer, “and an important part of the very broad research programme at CERN.”

The goal of the competition involves trapping and storing the simplest of all antimatter atoms, antihydrogen, with the purpose of studying it. Hydrogen is the lightest and most abundant chemical element.

“We know a lot about matter, but very little about antimatter. We assume there was as much antimatter created in the Big Bang as matter. There are many questions. Where is the antimatter? Where did it go? And why does it appear that there is more matter than antimatter?”  says Dr. Makoto Fujiwara, adjunct professor in the Department of Physics and Astronomy at the University of Calgary and a research scientist at TRIUMF, Canada's national laboratory for particle and nuclear physics.

ALPHA-Canada scientists and students have been playing leading roles in the experiment. “It's been a rare privilege and a tremendous learning experience taking part in this groundbreaking international endeavour,” says Richard Hydomako, a PhD student at the University of Calgary.

Trapping antimatter is tricky. When matter and antimatter get too close, they destroy each other, in a kind of explosion, leaving behind the energy which made them. The challenge is cooling the atoms enough, to about 272 degrees below zero Celsius, so that they are slow enough to be trapped in a magnetic storage device.
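The reason the atoms must be so cold is that magnetic wells for neutral atoms are extremely shallow. A rough sketch of the trap-depth estimate, assuming a field variation of 0.75 tesla across the trap (an illustrative figure, not one from the article) and taking antihydrogen's magnetic moment to be roughly one Bohr magneton:

```python
# Estimate of a magnetic trap's depth in temperature units: only atoms with
# thermal energy below MU_B * DELTA_B can be held.
MU_B = 9.274e-24   # Bohr magneton, J/T (approximate magnetic moment of antihydrogen)
K_B = 1.381e-23    # Boltzmann constant, J/K
DELTA_B = 0.75     # assumed field variation across the trap, T

trap_depth_K = MU_B * DELTA_B / K_B
print(f"trap depth ~{trap_depth_K:.2f} K above absolute zero")
```

A well only about half a kelvin deep explains why so few of the produced anti-atoms are slow enough to be captured.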

“We've been able to trap about 38 atoms, which is an incredibly small amount, nothing like what we would need to power Star Trek's starship Enterprise or even to heat a cup of coffee,” says Thompson, one of 42 co-authors of the Nature paper along with the University of Calgary’s Makoto Fujiwara and graduate students Richard Hydomako and Tim Friesen.

“Now we can start working on the next step which is to use tools to study it,” adds Thompson.

The paper, entitled "Trapped antihydrogen," is published in Nature.
Source: Reprinted news release via University of Calgary
