Technology

Monday, January 24, 2011

The Nuclear Age






The Earth exploded into the nuclear age on 16 July 1945. On that day, the US tested a completely new type of weapon in the New Mexico desert. Crafted from a tennis-ball-sized plutonium sphere, the Trinity bomb produced an explosion equivalent to 20,000 tonnes of TNT.
Sixty years on, tens of thousands of tonnes of plutonium and enriched uranium have been produced. The global nuclear arsenal stands at about 27,000 bombs. Nine countries very probably possess nuclear weapons, while 40 others have access to the materials and technology to make them.
But nuclear technology has also been used for peaceful means. The first nuclear reactor to provide electricity to a national grid opened in England in 1956. Now, 442 reactors in 32 nations generate 16% of the world's electricity.
Nuclear power has been championed as a source of cheap energy. But this was undermined at the end of the 20th century by high-profile reactor accidents, the problems of radioactive waste disposal, competition from more-efficient electricity sources and unavoidable links to nuclear weapons proliferation. Nonetheless, growing evidence for global warming has led some to argue that nuclear power is the only way to generate electricity without emitting greenhouse gases.

Splitting atoms

The first steps towards unleashing the power within the atomic nucleus began in 1905, when Albert Einstein established through his equation E = mc² that even tiny quantities of mass are equivalent to immense amounts of energy. In 1938, the German chemists Otto Hahn and Fritz Strassmann split inherently unstable uranium atoms by bombarding them with neutrons. The following year, Lise Meitner and Otto Frisch elucidated this process of nuclear fission, in which atomic nuclei are split to create nuclei of lighter elements, with neutrons and energy as by-products.
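The scale of that equivalence is easy to check with a few lines of Python. A back-of-envelope sketch, using the roughly 20-kiloton Trinity yield quoted above, shows how little mass actually needs to be converted:

```python
# Rough illustration of E = mc^2: how much mass must be converted to
# energy to match the Trinity yield of about 20,000 tonnes of TNT.
C = 299_792_458.0        # speed of light, m/s
TNT_JOULES = 4.184e9     # energy released by one tonne of TNT, joules

def mass_for_yield(kilotons):
    """Mass (kg) converted to energy for a given yield in kilotons of TNT."""
    energy = kilotons * 1000 * TNT_JOULES
    return energy / C**2

print(f"{mass_for_yield(20) * 1000:.2f} g")  # roughly 0.93 g
```

Less than a gram of the tennis-ball-sized plutonium sphere was converted to energy; the rest was scattered by the blast.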
In 1941 the US embarked on the top-secret Manhattan Project, which developed the bombs dropped on Hiroshima and Nagasaki at the end of World War Two. These are the only times nuclear weapons have been used in combat, though about 2000 have been tested. The Manhattan Project cost $2 billion and involved the work of 175,000 people, among them eight Nobel-prizewinning physicists. The scientists involved included Robert Oppenheimer, Enrico Fermi, Richard Feynman, Niels Bohr and Leo Szilard.
Uranium is the heaviest element found in nature in more than trace amounts, and natural ores contain two isotopes: U-238 and U-235. Only U-235, which makes up just 0.7% of ores, is fissile. So the uranium must be "enriched" to remove U-238 - highly enriched weapons-grade uranium can be up to 90% U-235.
When bombarded with neutrons, U-235 atoms absorb them and become unstable. They split to form two smaller nuclei of other elements and neutrons. Some of the mass is converted to energy in the form of gamma radiation and heat. Because only one neutron is needed to trigger fission and two or three are released, a chain reaction can result. This reaction is uncontrolled in an atomic bomb but tightly controlled in a nuclear reactor.
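The difference between a bomb and a reactor comes down to the average number of neutrons from each fission that go on to trigger another, often written k. A toy calculation (illustrative only, not a physical model) shows how sharply the neutron population depends on that number:

```python
# Toy model of a fission chain reaction: each generation, the neutron
# population is multiplied by k, the average number of neutrons from
# each fission that cause a further fission. k > 1 grows exponentially
# (a bomb), k = 1 holds steady (a reactor), k < 1 dies away.
def neutron_population(k, generations, start=1.0):
    """Neutron count after each generation, starting from `start`."""
    n = start
    history = [n]
    for _ in range(generations):
        n *= k
        history.append(n)
    return history

print(neutron_population(2.5, 5))   # runaway growth
print(neutron_population(1.0, 5))   # steady state
print(neutron_population(0.5, 5))   # reaction fizzles out
```

With two or three neutrons released per fission, an uncontrolled reaction multiplies by orders of magnitude in a handful of generations, which is why a reactor must hold k so close to one.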

Dropping the bomb


The Hiroshima bomb was made of enriched uranium: conventional explosives fired one sub-critical piece into another to form a supercritical mass. The Nagasaki bomb was made of plutonium, which is also fissile, and achieved its supercritical mass by explosive compression. Plutonium is produced in the spent fuel of a nuclear reactor, via the irradiation of uranium-238, and can be extracted to make weapons.
After 1945, the US developed massively destructive hydrogen bombs. Some are equivalent to many millions of tonnes of TNT, and yield vast amounts of energy through nuclear fusion, in which light atomic nuclei fuse to form heavier ones. Hydrogen bombs use small fission explosions to create the huge temperatures required for heavy isotopes of hydrogen to fuse.
Nuclear weapons technology has been adapted for many military uses, such as intercontinental missiles, huge fission weapons, bunker busters, mini-nukes, gamma ray weapons, nuclear landmines and nuclear defence missiles.
By bombing Japan, the US started a worldwide arms race, and the Cold War with the Soviet Union. The Soviets developed and tested their own bomb in 1949. The United Kingdom achieved the feat in 1952, followed by France in 1960, China in 1964 and most recently India and Pakistan in 1998.
Israel is widely thought to possess nuclear weapons and North Korea declared in 2005 that it did too, though neither has conducted tests. Iraq and Libya have attempted to develop them in the past, and Iran has been accused of having a secret nuclear weapons programme.

Stemming proliferation

While up to nine nations have nuclear weapons, 187 others have pledged not to manufacture them. Twenty countries, including Switzerland, Brazil, Argentina, Canada and South Africa, once had weapons programmes but, as signatories to the 1968 Treaty on the Non-Proliferation of Nuclear Weapons (NPT), subsequently abandoned them.
The NPT aimed to limit the spread of atomic weapons, and bound the five original nuclear weapons states to sharing nuclear technology and materials for peaceful means. Mainly through US and Russian disarmament, the treaty has achieved the decommissioning of 38,000 warheads since 1986.



However, by 2005 the treaty was under strain. Nuclear-armed states stand accused of failing to reduce their arsenals, and of considering new weapons, like mini-nukes. Iran reached an agreement with Europe to halt uranium enrichment activities, but may renege on that deal.
The 1996 Comprehensive Nuclear-Test-Ban Treaty is an attempt to limit test detonations and slow nuclear armament, but the US Senate refused to ratify it in 1999.
Controlling the remains of the Soviet Union's vast and poorly protected nuclear arsenal is another great challenge. The G8 have repeatedly pledged billions of dollars to help safeguard the massive stockpile.
The International Atomic Energy Agency is struggling to keep track of smuggling and the black market in nuclear materials and technology, and fears of terrorists acquiring a dirty bomb are frequently expressed. The sale of materials and information was highlighted in 2004, when a Pakistani nuclear scientist admitted to selling nuclear technology to Libya, North Korea and Iran.

Atoms for peace

Nuclear power generation has been linked to nuclear weapon proliferation. In fact, the first industrial-scale reactors, built in the US in 1944, were designed to produce plutonium for weapons, and the energy generated was wasted. The first nuclear reactor to provide electricity to a national grid opened at Calder Hall in England in 1956. Today countries such as Japan and France use nuclear power to provide up to 75% of their electricity.
Unlike in atomic weapons, nuclear reactors must tightly control the fission chain reaction. To prevent a runaway reaction, control rods are interspersed with the fuel rods of uranium or plutonium. The control rods absorb neutrons, and can be lowered into the reactor core to regulate energy output. A moderating substance, such as water or graphite, surrounds the rods, slowing neutrons emitted by the reaction, and deflecting them back to the centre.
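The feedback role the control rods play can be caricatured in a few lines of code. This is a deliberately simplified sketch (real reactor kinetics involve delayed neutrons and far more physics): rod insertion lowers the multiplication factor k, and a proportional controller nudges the rods to hold power at a target level.

```python
# Simplified sketch of control-rod feedback. Inserting rods absorbs
# neutrons and lowers the multiplication factor k; a proportional
# controller pushes the rods in when power runs high and withdraws
# them when it runs low, holding output near a target.
def simulate(generations=200, target=1000.0, start=1200.0):
    """Neutron population (arbitrary units) after feedback settles."""
    n = start
    for _ in range(generations):
        # more insertion when power exceeds the target (clamped to 0..1)
        insertion = min(1.0, max(0.0, 0.5 + 0.0005 * (n - target)))
        k = 1.1 - 0.2 * insertion   # k = 1.0 exactly when insertion = 0.5
        n *= k
    return n

print(round(simulate()))  # settles back to the 1000-unit target
```

Starting 20% above target, the loop damps the excess power away within a few dozen generations; the rods end up at the insertion depth where k is exactly one.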
A coolant circulates around the core and is pumped to a heat exchanger, where water becomes steam and drives electricity-generating turbines. Advanced gas-cooled reactors, such as those used in the UK, use compressed carbon dioxide as the coolant. Light-water, heavy-water and pressurised-water reactors use water as both moderator and coolant.
These reactors are inherently inefficient, utilising only around 1% of the energy stored in the uranium fuel. To overcome this inefficiency and minimise nuclear waste, some countries reprocess nuclear fuel. The Sellafield facility in the UK is the largest reprocessing facility in the world, but it has suffered many problems.
More advanced (but less safe) breeder reactors use liquid sodium metal as a coolant and generate plutonium fuel. Breeder reactors such as Superphénix in France, Dounreay in the UK, Monju in Japan and planned reactors in India can utilise up to 75% of the energy contained in uranium. New miniature Rapid-L reactors might one day even provide power in the basements of apartment blocks, and portable "take-away" reactors are planned for the future.
Nuclear fuel has also been used to power submarines, such as Russia's doomed Kursk; spacecraft such as Cassini, Galileo and the failed Mars-96; and ice breakers, aircraft carriers and other ships. The Pentagon even briefly entertained the idea of a nuclear-powered jet.

Going critical


However, several high profile accidents damaged public confidence in nuclear power. The worst US nuclear accident was in 1979, when a cooling system malfunctioned at Three Mile Island in Pennsylvania. The reactor melted down, releasing radioactive gas into the environment. There are now concerns about safety with other ageing US reactors.
The world's most catastrophic nuclear accident happened in 1986, at Chernobyl in Ukraine. Control rods were withdrawn from the reactor in a misguided safety test, causing a meltdown and massive explosions. The radiation released killed 30 people directly and spread over northern Europe.
The accident has led to radiation-induced conditions such as thyroid cancers and leukaemia, birth defects, infant deaths, and the contamination of lakes and forests. Three other reactors at Chernobyl resumed operation in 1988, but the last finally closed in 2000, after Western nations paid Ukraine to shut it down. Similar reactors in Eastern Europe may be just as dangerous.
In 1999, 70 people were exposed to radiation in Japan's Tokaimura uranium processing plant after workers added seven times the safe quantity of uranium to a settling tank. This triggered an uncontrolled chain reaction. Many other hazardous or lethal accidents have occurred in facilities such as Windscale, Sellafield, Mayak, Monju, Tsuruga and Mihama.
Radioactive nuclear waste - which remains dangerous for many thousands of years - is another serious drawback of the industry. Governments have considered disposing of it by reprocessing; burying it deep underground, such as at Yucca Mountain in Nevada, US; burning it; shipping it to other countries; zapping it with giant lasers; encasing it in glass blocks; and storing it on-site at nuclear facilities.
But concerns have been raised about potential flooding of repositories, secret disposal sites and the risks of transporting waste. Cleaning up decommissioned nuclear sites is also expensive and difficult.
Yet nuclear power still has one advantage that could prompt a comeback - the lack of greenhouse gas emissions. Some now tout it as a good way to reduce the emissions linked to global warming. The US government has already announced plans for a raft of new nuclear power stations - the first since 1979.

Parkinson's in New Technology

Faecal transplant eases symptoms of Parkinson's





A FEW years ago, John Gillies had trouble picking up his grandchild. He would stand frozen, waiting for his Parkinson's disease to relinquish its hold and allow him to move. Then in May 2008, Gillies was given antibiotics to treat constipation, and astonishingly his Parkinson's symptoms abated. What on earth was going on?
Thomas Borody, a gastroenterologist at the Centre for Digestive Diseases in New South Wales, Australia, put Gillies on antibiotics because he had found that constipation can be caused by an infection of the colon. "He has now been seen by two neurologists, who cannot detect classic Parkinson's disease symptoms any more," says Borody.
Borody's observations, together with others, suggest that many conditions, from Parkinson's to metabolic disorders such as obesity, might be caused by undesirable changes in the microbes of the gut. If that is true, it might be possible to alleviate symptoms with antibiotics, or even faecal transplants using donor faeces to restore the bowel flora to a healthy state.
Borody uses faecal transplants to cure people infected by the superbug Clostridium difficile, and to alleviate chronic constipation. Over the past decade, Borody has noticed that some of his patients also see improvements in symptoms of their other diseases, including Parkinson's, multiple sclerosis (MS), chronic fatigue syndrome (CFS) and rheumatoid arthritis. "Some CFS patients, given a faecal transplant, will regain their energy quite dramatically, and their foggy brains will get better," says Borody.
To test a possible link between the gut and Parkinson's disease, Borody and neurologist David Rosen of the Prince of Wales Private Hospital in Sydney are embarking on a pilot study, hoping to recruit people with both constipation and Parkinson's. The plan is first to treat them with antibiotics and eventually with faecal transplants. They hope both faecal transplants and antibiotics will treat gut infection and hence Parkinson's.
Rosen is cautious: "I wouldn't for one minute be suggesting that this is the next cure," he says. But the idea that Parkinson's could be caused by bacteria dovetails with work by neuroanatomists Heiko Braak and Kelly Del Tredici at the University of Ulm in Germany.
In 2003, Braak and Del Tredici showed that damage to the nervous system in Parkinson's progresses from the vagus nerve in the lower brain stem to the higher regions of the brain and eventually to the cerebral cortex. They also found damage in the enteric nervous system, which controls the gastrointestinal (GI) tract and communicates with the brain via the vagus nerve. This discovery prompted them to suggest that Parkinson's might be caused by a bug that breaks through the mucosal barrier of the GI tract and enters the central nervous system via the vagus nerve (Journal of Neural Transmission, DOI: 10.1007/s00702-002-0808-2).
So what about the dramatic improvements seen in people with autoimmune diseases, such as rheumatoid arthritis, after faecal transplant? Borody's hypothesis is that an infection of the colon releases antigens into the bloodstream, which trigger an immune response. Unless something is done to completely clear the colon of the antigen, the immune response is relentless, eventually leading to systemic inflammation that manifests itself as an autoimmune disease.
Interpreting Borody's results requires extreme caution. However, there is evidence from animal models that intestinal microbes can influence autoimmunity. For instance, Alexander Chervonsky of the University of Chicago and colleagues have linked microbes in the gut to type 1 diabetes, an autoimmune disorder caused by the destruction of insulin-secreting pancreatic cells. Over 80 per cent of a particular strain of engineered mice develop type 1 diabetes when kept germ-free. When the same mice were dosed with a cocktail of bacteria similar to those present in the human gut, only 34 per cent developed type 1 diabetes, suggesting a connection between gut flora and autoimmune diabetes (Nature, DOI: 10.1038/nature07336).
Researchers are becoming increasingly aware of the link between gut flora and autoimmunity, says Arthur Kaser, an expert on inflammation and intestinal flora at the University of Cambridge. For instance, mice designed to develop autoimmune diseases do so in some labs but not in others. The discrepancy is down to differences in the intestinal flora of the mice. "Intestinal microbiota has a dramatic effect on [what] we currently consider as autoimmune disease," says Kaser.
Evidence for such links in humans is also growing: Anne Vrieze of the Academic Medical Center in Amsterdam, the Netherlands, and colleagues studied 18 obese men with metabolic syndrome, a collection of symptoms that includes low insulin sensitivity. The group received faecal transplants - either of their own stool or stool from lean, healthy donors.
The results of this first double-blind trial were presented at the annual meeting of the European Association for the Study of Diabetes in Stockholm, Sweden, in September. The researchers found that, six weeks after the infusions, insulin sensitivity improved significantly in the nine men who received donor stool.





Gut flora has also been linked to obesity. Over the past five years, Jeffrey Gordon of Washington University in St Louis, Missouri, and colleagues have shown that there are marked differences in the gut flora of obese and lean individuals. Their analysis suggested that the microbes in obese individuals are releasing nutrients from food that would have remained undigested in lean individuals. Importantly, they showed that transferring the microbiota from obese mice into lean mice caused the lean mice to put on weight (Nature, DOI: 10.1038/nature05414).
So can you reverse obesity in humans by transferring gut microbes from lean people into obese people? It's a question that Alex Khoruts, at the University of Minnesota Medical School in Minneapolis, hopes to answer. He is planning a trial in which obese people will be given faecal transplants, either of their own faeces or samples taken from lean, healthy donors. "The idea is to alter the composition of colon flora, and see whether it has an impact on obesity," says Khoruts.
"This is absolutely exciting," says Kaser. But he insists that we are far from understanding the nature of the microbes that populate our body - after all, the colon alone contains nine times as many bacterial cells as there are human cells in the body. And we don't yet know what constitutes "healthy" colon flora. This will make it difficult to justify any large-scale adoption of faecal transplants, he adds. If intestinal bugs are indeed causing autoimmune diseases, "you don't want to treat one disease and introduce another", says Kaser.
Nonetheless, he is convinced that human microbiota will become increasingly important in our understanding of disease. "Textbooks will have to be rewritten when we consider the contribution of intestinal microbiota," he says. "We have an elephant in the room that has not yet been appreciated."

Green Machine: Bringing a forest to the desert

It may sound like an environmentalist's pipe dream, but giant greenhouses could soon be popping up in some of the world's deserts, producing fresh drinking water, food and fuel.


The Sahara Forest Project, which aims to create green oases in desert areas, has signed a deal to build a pilot plant in Aqaba, near the Red Sea in Jordan. With funding from the Norwegian government, the team plans to begin building the pilot plant on a 200,000 square metre site in 2012.
The world has an abundance of sunlight, seawater, carbon dioxide and arid land, says Joakim Hauge, CEO of the Sahara Forest Project. "These resources could be used for profitable and sustainable production of food, water and renewable energy, while combating the greenhouse effect through binding CO2 in new vegetation in arid areas."

If all goes to plan, the plant will consist of a saltwater greenhouse to grow vegetables and algae for fuel. Water piped from the Red Sea will cool air flowing into the greenhouse, providing good growing conditions for the crops. The air will then be passed over pipes containing seawater heated by the sun. The resulting hot, humid air will finally meet a series of vertical pipes containing cold seawater, causing fresh water to condense and run down the pipes to collectors below.
This fresh water will be heated by a Concentrating Solar Power Plant to provide steam to drive a turbine, generating electricity. In turn, the electricity will be used to power the greenhouse's pumps and fans. The water will also be used to grow crops around the greenhouse.
Finally, excess heat generated by the solar power plant will be used to produce drinking water through desalination.
The project has been developed by Max Fordham Consulting Engineers, Seawater Greenhouse, and Exploration Architecture, all based in London, and the Bellona Foundation in Oslo, Norway.

Friday, January 14, 2011

Wireless at the speed of plasma

Antennas that use plasma to focus beams of radio waves could bring us superfast wireless networks

BEFORE you leave for work in the morning, your smartphone downloads the latest episode of a television series. Your drive to work is easy in spite of fog, thanks to in-car radar and the intelligent transport software that automatically guides you around traffic jams, allowing you to arrive in time for a presentation in which high-definition video is streamed flawlessly to your tablet computer in real time.
This vision of the future may not be far off, thanks to a new type of antenna that makes use of plasma consisting of only electrons. It could revolutionise high-speed wireless communications, miniature radar and even energy weapons.
Existing directional antennas that transmit high-frequency radio waves require expensive materials or precise manufacturing. But the new antenna, called Plasma Silicon Antenna, or PSiAN, relies on existing low-cost manufacturing techniques developed for silicon chips. It has been developed by Plasma Antennas of Winchester, UK.
PSiAN consists of thousands of diodes on a silicon chip. When activated, each diode generates a cloud of electrons - the plasma - about 0.1 millimetres across. At a high enough electron density, each cloud reflects high-frequency radio waves like a mirror. By selectively activating diodes, the shape of the reflecting area can be changed to focus and steer a beam of radio waves. This "beam-forming" capability makes the antennas crucial to ultrafast wireless applications, because they can focus a stream of high-frequency radio waves that would quickly dissipate using normal antennas.
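PSiAN's exact geometry is not described here, but the beam-forming principle can be illustrated with the textbook array-factor formula for a uniform linear array of radiating or reflecting elements. In this generic sketch (not Plasma Antennas' design), the relative phase across the elements determines the direction of the main lobe:

```python
import cmath
import math

# Generic beam-forming sketch: the far-field power of a uniform linear
# array peaks in the steered direction, because contributions from all
# elements add in phase there and cancel elsewhere.
def array_factor(theta_deg, steer_deg, n_elements=16, spacing_wl=0.5):
    """Normalised far-field power (0..1) of a phased linear array."""
    theta = math.radians(theta_deg)
    steer = math.radians(steer_deg)
    total = sum(
        cmath.exp(2j * math.pi * spacing_wl * i
                  * (math.sin(theta) - math.sin(steer)))
        for i in range(n_elements)
    )
    return abs(total) ** 2 / n_elements ** 2

print(round(array_factor(30, 30), 3))  # full power on the steered axis
print(round(array_factor(0, 30), 3))   # strong cancellation off-axis
```

Switching diode clouds on and off plays the role of choosing which elements contribute, so the beam can be reshaped and redirected electronically, with no moving parts.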
"Beam-forming antennas are the key for enabling next-generation, high-data-rate indoor wireless applications," says Anmol Sheth, at Intel Labs in Seattle. "Without beam-forming antennas it would be difficult to scale to the levels of density of wireless devices we expect to have in future homes."
There are two types of plasma antenna: semiconductor or solid-state antennas, such as PSiAN, and gas antennas. Both could fit the bill, but solid-state antennas are favoured as they are more compact and have no moving parts.
That makes them attractive for use in a new generation of ultrafast Wi-Fi, known as Wi-Gig. Existing Wi-Fi tops out at 54 megabits of data per second, whereas the Wi-Gig standard is expected to go up to between 1 and 7 gigabits per second - fast enough to download a television programme in seconds. Wi-Gig requires higher radio wave frequencies, though: 60 gigahertz rather than the 2.4 GHz used by Wi-Fi. Signals at these frequencies disperse rapidly unless they are tightly focused, which is where PSiAN comes in.
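Those data rates translate directly into download times. Taking a hypothetical 1.5-gigabyte high-definition television episode as the file size (an assumed figure, for illustration):

```python
# Download time for a file of a given size at a given link rate,
# using decimal units (1 GB = 8000 megabits).
def download_seconds(size_gigabytes, rate_mbps):
    """Seconds to transfer `size_gigabytes` at `rate_mbps` megabits/s."""
    return size_gigabytes * 8 * 1000 / rate_mbps

EPISODE_GB = 1.5  # assumed size of an HD television episode
print(round(download_seconds(EPISODE_GB, 54)))     # Wi-Fi at 54 Mbit/s
print(round(download_seconds(EPISODE_GB, 7000)))   # Wi-Gig at 7 Gbit/s
```

At 54 megabits per second the episode takes nearly four minutes even at the theoretical peak rate; at 7 gigabits per second it arrives in about two seconds, which is the "download in seconds" promise of Wi-Gig.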
Ian Russell, business development director at Plasma Antennas, says that PSiAN is small enough to fit inside a cellphone. "Higher frequencies mean shorter wavelengths and hence smaller antennas," he says. "The antenna actually becomes cheaper at the smaller scales because you need less silicon."
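The shrinkage Russell describes follows directly from the relation between wavelength and frequency, λ = c/f:

```python
# Wavelength in millimetres for a given frequency in gigahertz,
# from lambda = c / f.
C = 299_792_458.0  # speed of light, m/s

def wavelength_mm(freq_ghz):
    """Free-space wavelength in millimetres."""
    return C / (freq_ghz * 1e9) * 1000

print(f"2.4 GHz: {wavelength_mm(2.4):.0f} mm")  # Wi-Fi, ~125 mm
print(f"60 GHz:  {wavelength_mm(60):.1f} mm")   # Wi-Gig, ~5 mm
```

Since antenna dimensions scale with wavelength, moving from 2.4 GHz to 60 GHz shrinks the structure by a factor of 25, small enough to fit a beam-forming array inside a phone.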
The antennas shouldn't raise any health issues, as they are covered by existing safety standards. The narrow beam means there is less "overspill" of radiation than with existing omnidirectional antennas.
As well as speeding up Wi-Fi, plasma antennas could also allow cars to come with low-cost miniature radar systems to help drivers avoid collisions. Their millimetre wavelengths could be used to "see" through fog or rain, and another set of antennas could listen for real-time updates on traffic and road conditions.




The US military is also interested in solid-state plasma antennas, for use in a more advanced version of its so-called "pain beam", a weapon called the Active Denial System. The ADS heats a person's skin painfully with a beam of 95 GHz radio waves. But the current design involves a 2-metre-wide, mechanically steered antenna mounted on a large truck. Switching to a small, lightweight plasma antenna would allow multiple narrow beams to selectively target several individuals at once.
Ted Anderson of Haleakala R&D, based in Brookfield, Massachusetts, has been involved in the development of gas plasma antennas for many years. He points out that although the solid-state version is compact, it is limited to high frequencies, making certain applications tricky. For instance, indoor Wi-Gig routers operating at 60 GHz wouldn't be able to penetrate walls. The signal would instead have to be reflected off surfaces to reach every room in a house.
"Semiconductor plasma antennas will work at only high frequencies, between 1 GHz and 100 GHz," says Anderson. "Theoretically, we see no upper or lower bound to ionised gas antennas in the radio frequency spectrum."
Russell says that PSiAN could be commercially available within two years. At present, getting movies and high-quality images on and off our smartphones almost certainly means hooking them into a computer. But as the demand for such content increases, the only way to break the wire is going to be an ultrafast wireless connection. When it comes, it may very well be in the form of plasma.

Thursday, January 13, 2011

Scorn over claim of teleported DNA

A Nobel prizewinner is reporting that DNA can be generated from its teleported "quantum imprint"

A STORM of scepticism has greeted experimental results emerging from the lab of a Nobel laureate which, if confirmed, would shake the foundations of several fields of science. "If the results are correct," says theoretical chemist Jeff Reimers of the University of Sydney, Australia, "these would be the most significant experiments performed in the past 90 years, demanding re-evaluation of the whole conceptual framework of modern chemistry."

Luc Montagnier, who shared the Nobel prize for medicine in 2008 for his part in establishing that HIV causes AIDS, says he has evidence that DNA can send spooky electromagnetic imprints of itself into distant cells and fluids. If that wasn't heretical enough, he also suggests that enzymes can mistake the ghostly imprints for real DNA, and faithfully copy them to produce the real thing. In effect this would amount to a kind of quantum teleportation of the DNA.
Many researchers contacted for comment by New Scientist reacted with disbelief. Gary Schuster, who studies DNA conductance effects at Georgia Institute of Technology in Atlanta, compared it to "pathological science". Jacqueline Barton, who does similar work at the California Institute of Technology in Pasadena, was equally sceptical. "There aren't a lot of data given, and I don't buy the explanation," she says. One blogger has suggested Montagnier should be awarded an IgNobel prize.
Yet the results can't be dismissed out of hand. "The experimental methods used appear comprehensive," says Reimers. So what have Montagnier and his team actually found?
Full details of the experiments are not yet available, but the basic set-up is as follows. Two adjacent but physically separate test tubes were placed within a copper coil and subjected to a very weak extremely low frequency electromagnetic field of 7 hertz. The apparatus was isolated from Earth's natural magnetic field to stop it interfering with the experiment. One tube contained a fragment of DNA around 100 bases long; the second tube contained pure water.
After 16 to 18 hours, both samples were independently subjected to the polymerase chain reaction (PCR), a method routinely used to amplify traces of DNA by using enzymes to make many copies of the original material. The gene fragment was apparently recovered from both tubes, even though one should have contained just water (see diagram).
DNA was only recovered if the original solution of DNA - whose concentration has not been revealed - had been subjected to several dilution cycles before being placed in the magnetic field. In each cycle it was diluted 10-fold, and "ghost" DNA was only recovered after between seven and 12 dilutions of the original. It was not found at the ultra-high dilutions used in homeopathy.
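The dilution arithmetic is straightforward: each 10-fold cycle leaves a tenth of the previous concentration, so the seven-to-12 cycle window in which the "ghost" DNA appeared corresponds to between one ten-millionth and one trillionth of the starting concentration:

```python
# Serial 10-fold dilution: after d cycles, only 1/10**d of the
# starting material remains.
def concentration_after(dilutions, start=1.0):
    """Fraction of the original concentration left after `dilutions` cycles."""
    return start / 10 ** dilutions

for d in (7, 12):
    print(f"{d} dilutions: {concentration_after(d):.0e} of the original")
```

Homeopathic remedies routinely go far beyond this, to dilutions at which not a single molecule of the original substance can remain, which is why the team stresses that the effect vanished at those ultra-high dilutions.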
Physicists in Montagnier's team suggest that DNA emits low-frequency electromagnetic waves which imprint the structure of the molecule onto the water. This structure, they claim, is preserved and amplified through quantum coherence effects, and because it mimics the shape of the original DNA, the enzymes in the PCR process mistake it for DNA itself, and somehow use it as a template to make DNA matching that which "sent" the signal (arxiv.org/abs/1012.5166).
"The biological experiments do seem intriguing, and I wouldn't dismiss them," says Greg Scholes of the University of Toronto in Canada, who last year demonstrated that quantum effects occur in plants. Yet according to Klaus Gerwert, who studies interactions between water and biomolecules at the Ruhr University in Bochum, Germany, "It is hard to understand how the information can be stored within water over a timescale longer than picoseconds."

"The structure would be destroyed instantly," agrees Felix Franks, a retired academic chemist in London who has studied water for many years. Franks was involved as a peer reviewer in the debunking of a controversial study in 1988 which claimed that water had a memory (see "How 'ghost molecules' were exorcised"). "Water has no 'memory'," he says now. "You can't make an imprint in it and recover it later."
Despite the scepticism over Montagnier's explanation, the consensus was that the results deserve to be investigated further. Montagnier's colleague, theoretical physicist Giuseppe Vitiello of the University of Salerno in Italy, is confident that the result is reliable. "I would exclude that it's contamination," he says. "It's very important that other groups repeat it."
In a paper last year (Interdisciplinary Sciences: Computational Life Sciences, DOI: 10.1007/s12539-009-0036-7), Montagnier described how he discovered the apparent ability of DNA fragments and entire bacteria both to produce weak electromagnetic fields and to "regenerate" themselves in previously uninfected cells. Montagnier strained a solution of the bacterium Mycoplasma pirum through a filter with pores small enough to prevent the bacteria penetrating. The filtered water emitted the same frequency of electromagnetic signal as the bacteria themselves. He says he has evidence that many species of bacteria and many viruses give out the electromagnetic signals, as do some diseased human cells.
Montagnier says that the full details of his latest experiments will not be disclosed until the paper is accepted for publication. "Surely you are aware that investigators do not reveal the detailed content of their experimental work before its first appearance in peer-reviewed journals," he says.

How 'ghost molecules' were exorcised

The latest findings by Luc Montagnier evoke long-discredited work by the French researcher Jacques Benveniste. In a paper in Nature (vol 333, p 816) in 1988 he claimed to show that water had a "memory", and that the activity of human antibodies was retained in solutions so dilute that they couldn't possibly contain any antibody molecules (New Scientist, 14 July 1988, p 39).
Faced with widespread scepticism over the paper, including from the chemist Felix Franks who had advised against publication, Nature recruited magician James Randi and chemist and "fraudbuster" Walter Stewart of the US National Institutes of Health in Bethesda, Maryland, to investigate Benveniste's methods. They found his result to be "a delusion", based on a flawed design. In 1991, Benveniste repeated his experiment under double-blind conditions, but not to the satisfaction of referees at Nature and Science. Two years later came the final indignity when he was suspended for damaging the image of his institute. He died in October 2004.
That's not to say that quantum effects must be absent from biological systems. Quantum effects have been proposed in both plants and birds. Montagnier and his colleagues are hoping that their paper won't suffer the same fate as Benveniste's.

Tuesday, January 11, 2011

The Future is Electrifying

Gentlemen - and women - plug in your engines. This will be the year of the electric car. No, seriously. After seemingly endless testing, technical hiccups and plain reluctance on the part of manufacturers to move electric vehicles from the concept phase to the showroom, it's finally happening. A fleet of new cars powered by the plug instead of the pump will take to the road in 2011.

Leading the charge is the Chevy Volt. With a 16-kilowatt-hour battery and a 110-kilowatt (149-horsepower) electric motor, it can go 60 kilometres on a single charge, plenty for commuting and weekend grocery runs. Critics point out that a 1.4-litre gasoline engine kicks in when the battery runs down, making the Volt a mere hybrid rather than a fully fledged electric car. And with demand for the Volt forecast to far outstrip supply, some dealers in the US are reportedly slapping steep premiums on top of the already hefty $40,280 price tag.
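Taking the quoted figures at face value, the Volt's battery capacity and electric range imply a consumption figure that can be sanity-checked with a few lines of arithmetic. This is a rough sketch only: real-world consumption depends on how much of the pack the car actually uses, driving conditions and charging losses, and the electricity price below is our assumption, not a figure from the article.

```python
# Back-of-envelope energy figures for the Chevy Volt, using only the
# numbers quoted above (16 kWh pack, 60 km electric-only range).
battery_wh = 16 * 1000          # pack capacity in watt-hours
range_km = 60                   # quoted electric-only range

consumption_wh_per_km = battery_wh / range_km
print(f"{consumption_wh_per_km:.0f} Wh/km")   # ~267 Wh/km

# Hypothetical electricity price of $0.12/kWh (an assumption) gives a
# rough electric running cost per kilometre.
price_per_kwh = 0.12
cost_per_km = consumption_wh_per_km / 1000 * price_per_kwh
print(f"${cost_per_km:.3f} per km")           # ~$0.032 per km
```

Even this crude estimate shows why electricity undercuts gasoline per kilometre; the sticker price, not the running cost, is the obstacle.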
Even if the Volt fizzles, the Nissan Leaf, Ford Focus Electric and Renault Fluence will all be widely available in the next 12 months. Then there's Mitsubishi's diminutive i-MiEV, powered by a 47-kilowatt electric motor and boasting a range of 160 kilometres. It has been on the road in Japan since 2009 and is expected to go on sale in both the UK and the US in the new year.
Two factors have combined to bring electric cars to the mass market at last: the arrival of high-capacity batteries and the near-collapse of the American auto industry, which forced US car makers into building small, efficient vehicles that can compete with foreign offerings.
The biggest remaining obstacle is cost. Electric vehicles offer the amenities of a compact car at the price of a luxury sedan. Tax breaks in some countries should help. But if the quiet whoosh of the electric motor is to replace the growl of the internal combustion engine, prices will have to plummet. Competition and yet more innovation in battery and drivetrain technology could allow that to happen.

Contact lenses for health and head-up displays

Lenses that monitor eye health are on the way, and in-eye 3D image displays are being developed too – welcome to the world of augmented vision
THE next time you gaze deep into someone's eyes, you might be shocked at what you see: tiny circuits ringing their irises, their pupils dancing with pinpricks of light. These smart contact lenses aren't intended to improve vision. Instead, they will monitor blood sugar levels in people with diabetes or look for signs of glaucoma.
The lenses could also map images directly onto the field of view, creating head-up displays for the ultimate augmented reality experience, without wearing glasses or a headset. To produce such lenses, researchers are merging transparent, eye-friendly materials with microelectronics.


In 2008, as a proof of concept, Babak Parviz at the University of Washington in Seattle created a prototype contact lens containing a single red LED. Using the same technology, he has now created a lens capable of monitoring glucose levels in people with diabetes.
It works because glucose levels in tear fluid correspond directly to those found in the blood, making continuous measurement possible without the need for thumb pricks, he says. Parviz's design calls for the contact lens to send this information wirelessly to a portable device worn by diabetics, allowing them to manage their diet and medication more accurately.
Lenses that also contain arrays of tiny LEDs may allow this or other types of digital information to be displayed directly to the wearer through the lens. This kind of augmented reality has already taken off in cellphones, with countless software apps superimposing digital data onto images of our surroundings, effectively blending the physical and online worlds.
Making it work on a contact lens won't be easy, but the technology has begun to take shape. Last September, Sensimed, a Swiss spin-off from the Swiss Federal Institute of Technology in Lausanne, launched the very first commercial smart contact lens, designed to improve treatment for people with glaucoma.
The disease puts pressure on the optic nerve through fluid build-up, and can irreversibly damage vision if not properly treated. Highly sensitive platinum strain gauges embedded in Sensimed's Triggerfish lens record changes in the curvature of the cornea, which correspond directly to the pressure inside the eye, says CEO Jean-Marc Wismer. The lens transmits this information wirelessly at regular intervals to a portable recording device worn by the patient, he says.
Like an RFID tag or London's Oyster travel cards, the lens gets its power from a nearby loop antenna - in this case taped to the patient's face. The powered antenna transmits electricity to the contact lens, which is used to interrogate the sensors, process the signals and transmit the readings back.
Each disposable contact lens is designed to be worn just once for 24 hours, and the patient repeats the process once or twice a year. This allows researchers to look for peaks in eye pressure which vary from patient to patient during the course of a day. This information is then used to schedule the timings of medication.
"The timing of these drugs is important," Wismer says.
Parviz, however, has taken a different approach. His glucose sensor uses sets of electrodes to run tiny currents through the tear fluid and measures them to detect very small quantities of dissolved sugar. These electrodes, along with a computer chip that contains a radio frequency antenna, are fabricated on a flat substrate made of polyethylene terephthalate (PET), a transparent polymer commonly found in plastic bottles. This is then moulded into the shape of a contact lens to fit the eye.
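The measurement principle Parviz describes is amperometric: the current driven through the tear film scales with the amount of dissolved glucose, so a calibration curve maps measured current back to concentration. A minimal sketch of that idea follows, with entirely hypothetical sensor constants — the article reports no calibration data.

```python
# Illustrative amperometric calibration: assume sensor current grows
# linearly with glucose concentration. Both constants are hypothetical,
# chosen only to show the forward/inverse relationship.
BASELINE_NA = 5.0        # current with no glucose, in nanoamps (assumed)
SLOPE_NA_PER_MMOL = 2.0  # extra nanoamps per mmol/L of glucose (assumed)

def current_from_glucose(mmol_per_l: float) -> float:
    """Forward model: tear-fluid glucose level -> sensor current (nA)."""
    return BASELINE_NA + SLOPE_NA_PER_MMOL * mmol_per_l

def glucose_from_current(current_na: float) -> float:
    """Inverse calibration, as a readout device would apply it."""
    return (current_na - BASELINE_NA) / SLOPE_NA_PER_MMOL

# Round-trip check with a plausible tear-glucose level of 0.5 mmol/L:
reading = glucose_from_current(current_from_glucose(0.5))
print(reading)  # 0.5
```

The practical engineering problem is that tear-glucose currents are tiny, which is why the on-lens electronics must amplify and digitise the signal before transmitting it.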
Parviz plans to use a higher-powered antenna to get a better range, allowing patients to carry a single external device in their breast pocket or on their belt. Preliminary tests show that his sensors can accurately detect even very low glucose levels. Parviz is due to present his results later this month at the IEEE MEMS 2011 conference in Cancún, Mexico.
"There's still a lot more testing we have to do," says Parviz. In the meantime, his lab has made progress with contact lens displays. They have developed both red and blue miniature LEDs - leaving only green for full colour - and have separately built lenses with 3D optics that resemble the head-up visors used to view movies in 3D.
Parviz has yet to combine both the optics and the LEDs in the same contact lens, but he is confident that even images so close to the eye can be brought into focus. "You won't necessarily have to shift your focus to see the image generated by the contact lens," says Parviz. It will just appear in front of you, he says. The LEDs will be arranged in a grid pattern, and should not interfere with normal vision when the display is off.
For Sensimed, the circuitry is entirely around the edge of the lens (see photo). However, both have yet to address the fact that wearing these lenses might make you look like the robots in the Terminator movies. False irises could eventually solve this problem, says Parviz. "But that's not something at the top of our priority list," he says.

Sunday, January 2, 2011

Khost University CardioResting™ ECG

CardioResting™ ECG is a complete interpretive, real-time, 12-lead ECG cardiology system capable of recording, analyzing and storing data when connected via USB to a Windows-based PC, with historical comparison capability. A truly portable system, it can be used with a laptop or tablet computer and taken anywhere. CardioResting™ ECG systems are reliable, user-friendly, compact and durable, letting you perform tests quickly and manage them easily at the point of care.


CardioResting™ ECG Includes


  • CardioResting™ PC Based ECG System 
  • CardioCard™ Resting ECG Software
  • USB PC ECG Cable - Resting
  • Resting ECG Patient Cables
  • Operator Manual
  • Starter Kit 100 Resting ECG Electrodes
  • Free Training and Toll-free Technical Support
  • 2 Year Warranty on Software and Device
    (Parts & Labor, Repair & Replacement)
  • Free Software Upgrades during Warranty

CardioResting™ ECG Features

  • Networking: Built-in
  • Leads: Simultaneous interpretive 12-lead acquisition of your PC ECG/PC EKG
  • Real-time 1, 2, 3, 6 or 12-lead color display of the ECG/EKG complex with real-time ST and HR
  • ECG Interpretation Classes: MIs, blocks, enlargements and axis
  • Cable: 10 foot ECG patient cable with locking replaceable leads
  • WINDOWS Based Software: Easy point and click user interface
  • Paper Type: 8.5” x 11” standard printer paper
  • Print Formats: Final 12-lead report to any WINDOWS printer
    (no fading, low cost and readily available)
  • View/print full disclosure measurements for all stages, ST MAX/ST average
  • Storage: Unlimited database
    Capture unlimited ECG/EKG durations of patient ECG/EKG data to any hard drive type
  • Full patient demographics and information in an integrated and practically unlimited database
  • Reports: User-selectable lead formats on final ECG/EKG report
  • Reports: ECG to PDF, fax, email and EMRs
  • Graphs/Tables: BP and HR trend, speed/grade and representative beats for all stages, etc
  • On-screen Lead quality indication
  • On-screen status of test: phase time, total time, speed, grade, target HR, 12-lead ST levels and slopes
  • Power Requirement: USB connection
    (no batteries or extra power needed)
  • Measurements: HR with global measurements, complete interval values and ST information
  • Comparisons: Serial historical ECG/EKG comparisons
  • Protocols: Industry-leading number of pre-programmed protocols, plus pharmacological and user-defined protocols
  • Interpretation: MEANS interpretation program (Erasmus University Medical Center, Rotterdam)
  • Filters: High-performance baseline filter
  • Pacemaker Detection
  • Baseline correction and other filters to minimize motion artifact
  • Warranty: 2-year limited warranty
  • Gain Accuracy: ± 5%
  • Frequency Response: 0.05 to 150 Hz
  • CMRR: >120 dB
  • Input Impedance: >10¹² ohms
  • System Noise: <40 µV RTI
  • Dynamic Range: ±5 mV
  • Dimensions: 2.75” W x 1.25” D x 4.87” H
  • Weight: 0.26 lb (0.12 kg)

CardioCard™ Software seamlessly integrates CardioResting™, CardioStress™ and CardioHolter™ ECG tests as they are added. The CardioCard™ database is sorted by Patient Name and Social Security Number. Physician editing comments can be attached to the Patient Record.
Features:
  • Full 12-lead Interval and ST Slope/Elevation Measurements
  • Physician Editing Comments can be Appended to Patient Record
  • Store Data to Hard Drive, Diskette, Zip, Optical or any other Storage Medium
  • Export your ECG, Holter and Stress Reports to PDF, fax, email and EMRs
  • Electronic Calipers (millisecond resolution) for Custom Measurement Capability
  • "Seamlessly" Integrates CardioResting ECG, CardioStress ECG and CardioHolter
  • Virtually Unlimited Storage of Patient Records
  • Data Transmission via Modem, FAX or Direct Connect
  • Single ECG or Batch Transmission Capability
  • "On Call ECG Receive" Capability
  • WINDOWS: Point & Click User Interface
  • Sorted by Patient Name and Social Security #
  • Network Interface
Minimum PC System Requirements
  • IBM-compatible Pentium or higher
  • 512 MB RAM for Windows 2000/XP/Vista, or Mac with Boot Camp
  • CD ROM and USB Ports
  • Storage: 5 MB hard disk space for programs and patient data.
    • For standard 12-lead tests, 60 kB is needed per patient test
  • Operating Systems: Windows 2000/XP/Vista, or Mac with Boot Camp
  • Display: Any Windows compatible display
  • Printers: Any Windows compatible printer
  • Paper Type: 8.5” x 11” standard printer paper
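The quoted figure of roughly 60 kB per standard 12-lead test makes disk-space planning straightforward. A quick sketch, taking the vendor's per-test figure at face value and assuming a clinic volume of 5,000 tests a year (our example number, not the vendor's):

```python
# Rough disk-space estimate from the quoted spec: ~60 kB per standard
# 12-lead resting ECG test. The annual test volume is an assumed example.
KB_PER_TEST = 60
tests_per_year = 5_000          # hypothetical clinic volume

mb_per_year = KB_PER_TEST * tests_per_year / 1_000
print(f"{mb_per_year:.0f} MB per year")   # 300 MB per year

# How many tests fit in 1 GB (decimal) of patient-data storage:
tests_per_gb = 1_000_000 // KB_PER_TEST
print(f"{tests_per_gb} tests per GB")     # 16666 tests per GB
```

On modern hard drives, then, the "virtually unlimited" database claim is plausible: even a busy practice generates well under a gigabyte of ECG data per year.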
Sample Screen CardioCard™ Resting ECG Software