Archive for the ‘Technology’ Category

Peter Alwin wins Electrolux Design Lab 2010

September 23, 2010

Peter Alwin from National Institute of Design in India is the winner of the Electrolux Design Lab 2010 competition for inventing The Snail.

The Snail by Peter Alwin

The Snail is a portable heating and cooking device based on magnetic induction. Small and versatile, it can be stuck directly onto a pot, a pan, a mug and the like to heat the contents. This reduces the space required for conventional cooking while making the process portable. Powered by a high-density sugar-crystal battery, the Snail converts energy from the sugar to heat a coil, which transfers heat to the utensil by magnetic induction. Integrated sensors detect the type of food being heated and automatically adjust the time and temperature, and a simple touch-sensitive display lets the user monitor the process.

The top eight finalists

FIRST PLACE: The Snail, Micro Induction Heating by Peter Alwin, National Institute of Design, India

Bio robot refrigerator

SECOND PLACE: Bio Robot Refrigerator, Cool, Green, Food Preservation by Yuriy Dmitriev, CSU, Russia

THIRD PLACE: Elements Modular Kitchen, All-In-One Kitchen Shelving by Matthew Gilbride, North Carolina State University, USA

Elements modular kitchen

PEOPLE’S CHOICE: Bio Robot Refrigerator, Cool, Green, Food Preservation by Yuriy Dmitriev, CSU, Russia

  • The Kitchen Hideaway by Daniel Dobrogorsky, Monash University, Australia
  • Clean Closet, All in One Laundry Concept by Michael Edenius, Umeå Institute of Design, Sweden
  • Dismount Washer, Wash & Go Laundry by Lichen Guo, Zhejiang University of Technology, China
  • External Refrigerator, External Cooling by Nicolas Hubert, L’Ecole de Design Nantes Atlantique, France
  • Eco Cleaner, the Portable, Compact Dishwasher by Ahi Andy Mohsen, Elm o Sanat University, Iran

See slide show at

http://news.discovery.com/tech/future-appliances-mega-cities-electrolux.html

Stuttgart’s white elephant

September 23, 2010

Hamada Marine "Bridge to Nowhere"

Japan is famous for its bridges to nowhere and highways without traffic but Germany is not immune from this extravagant form of supporting the construction industry and their powerful lobbies.

Der Spiegel runs a scathing attack on the white elephant that is “Stuttgart 21” and Deutsche Bahn’s CEO Rüdiger Grube:

A multibillion-euro railway development project is going ahead in Stuttgart, despite the fact that it offers hardly any benefits for the rail network and the money would be better spent elsewhere. Experts have been warning against the plans for years, but they were ignored.

Current estimates put the costs of building the subterranean railway station in Stuttgart, the capital of the southwestern German state of Baden-Württemberg, at €4.1 billion ($5.38 billion). An associated high-speed rail line to Ulm, a city lying about 90 kilometers (56 miles) southeast of Stuttgart, is slated to cost another €3 billion.

But what, you might ask, is the payoff for Deutsche Bahn, the federal government or the EU of implementing Stuttgart 21 and building the new line to Ulm? Deutsche Bahn CEO Rüdiger Grube offers one answer: The building project, he explains, will “eliminate the biggest bottleneck on the high-speed route from Paris to Bratislava.”

It would seem that Grube still doesn’t have his facts straight. It might help if he actually took the train from Paris to Bratislava. The roughly 13-hour trip would probably be enough to convince him that this so-called express corridor actually isn’t so express and that boring tunnels through the karst formations of the Swabian Alps mountain range for the Stuttgart-Ulm line is not about to make the connection significantly more attractive.

See map Paris to Bratislava

As Düsseldorf-based engineer Sven Andersen puts it, “Stuttgart 21 does nothing for long-distance travel.” Unlike Grube, Andersen has spent his entire career working in the railway industry, most recently as an expert on operational issues, and is considered one of the top experts on Germany’s railway system.

New Stuttgart station

As Andersen sees it, Stuttgart 21 and the related plan to build the Stuttgart-Ulm high-speed railway line are “a transportation-policy disaster.” Likewise, he adds, the project seems to be based on a complete misunderstanding of Stuttgart’s role in the German and European railway network. “Stuttgart is a destination,” he says. “It’s not a place people travel through to get someplace else. Converting the station into a through station won’t be an improvement on any significant route.” Indeed, all you have to do is look at a map to realize that Stuttgart is not a central location. All fast connections between key economic zones pass through other cities. For example, the Frankfurt-Zurich route runs far west of Stuttgart through Karlsruhe and Basel, while the Frankfurt-Munich route makes a wide arc through Würzburg and Nuremberg, far north and east of Stuttgart.

Full article:

http://www.spiegel.de/international/germany/0,1518,717575,00.html

“You left spacedock without a tractor beam?”: Mysterious force holds back NASA probes

September 19, 2010

Star Trek Generations:

Kirk: You left spacedock without a tractor beam?
Harriman: It doesn’t arrive until Tuesday.

The Telegraph:

A space probe launched 30 years ago has come under the influence of a mysterious force that has baffled scientists and could rewrite the laws of physics. Researchers say Pioneer 10, which took the first close-up pictures of Jupiter before leaving our solar system in 1983, is being pulled back to the sun by an unknown force. The effect shows no sign of getting weaker as the spacecraft travels deeper into space, and scientists are considering the possibility that the probe has revealed a new force of nature.

Tractor beam arriving on Tuesday

“If the effect is real, it will have a big impact on cosmology and spacecraft navigation,” said Dr Laing, of the Aerospace Corporation of California. Pioneer 10 was launched by Nasa on March 2 1972, and with Pioneer 11, its twin, revolutionised astronomy with detailed images of Jupiter and Saturn. In June 1983, Pioneer 10 passed Pluto, the most distant planet in our solar system.

Pioneer 10 trajectory

Research to be published shortly in The Physical Review, a leading physics journal, will show that the speed of the two probes is being changed by about 6 mph per century – a barely-perceptible effect about 10 billion times weaker than gravity.
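The quoted figure can be turned into the acceleration physicists usually cite for the Pioneer anomaly. A quick back-of-the-envelope check (my own arithmetic, not from the article):

```python
# Convert the Telegraph's "6 mph per century" into an acceleration
# and compare it with Earth's surface gravity.

MPH_TO_MS = 0.44704                       # metres per second in one mile per hour
SECONDS_PER_CENTURY = 100 * 365.25 * 86400

delta_v = 6 * MPH_TO_MS                   # ~2.68 m/s of velocity change
anomaly = delta_v / SECONDS_PER_CENTURY   # anomalous acceleration in m/s^2

g = 9.81                                  # Earth's surface gravity, m/s^2
ratio = g / anomaly                       # how much weaker than gravity

print(f"anomalous acceleration ≈ {anomaly:.2e} m/s^2")   # ≈ 8.5e-10 m/s^2
print(f"≈ {ratio:.1e} times weaker than g")              # ≈ 1.2e10
```

This lands close to the ~8.7 × 10⁻¹⁰ m/s² usually quoted for the anomaly, and the gravity comparison does indeed come out near ten billion, consistent with the article's claim.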

Assertions by some scientists that the force is due to a quirk in the Pioneer probes have also been discounted by the discovery that the effect seems to be affecting Galileo and Ulysses, two other space probes still in the solar system. Data from these two probes suggests the force is of the same strength as that found for the Pioneers.

Dr Duncan Steel, a space scientist at Salford University, says even such a weak force could have huge effects on a cosmic scale. “It might alter the number of comets that come towards us over millions of years, which would have consequences for life on Earth. It also raises the question of whether we know enough about the law of gravity.”

Tata Nano +: Tata Motors riding high

September 19, 2010

Tata Nano +

Tata Motors is to woo India’s budget small-car customers by launching an upgraded version of its Nano, the new Tata Nano Plus, sometime in 2011. The Nano Plus for the Indian market is expected to be similar to the Nano Europa.

The Tata Nano was launched to woo Indian automobile customers with its Rs. 1 lakh ($2,200) price tag but has failed to live up to the initial hype because of technical problems and delayed deliveries. The new Tata Nano Plus will include a more powerful 1,000 cc engine in place of the older 623 cc unit, as well as ABS, alloy wheels, an integrated music system and improved interiors. The car will be along the lines of the Nano Europa and will compete with the Maruti Alto and Chevrolet Spark. Delivery should not be a problem as the new Sanand plant increases production.

Tata Motors’ global vehicles sales rose 29% to 85,411 units in August 2010 over August 2009. The global sales include figures of its British luxury unit Jaguar Land Rover, whose sales rose 29% to 16,220 units in August 2010 over August 2009.

Tata CNG hybrid bus

Tata CNG hybrid bus

Tata Motors are also bringing out India’s first CNG-Electric hybrid public transport bus. It can accommodate 32 people, uses a parallel hybrid system and has a top speed of 72 kph.

Tata Motors has reported growth of 29 percent in August. Total sales of Tata vehicles were 85,114 units in August 2010, a growth of 29 percent over August 2009. This takes cumulative sales for the fiscal year (April 2010 – August 2010) to 424,938, higher by 42 percent compared with the corresponding period of 2009-10. Sales of all commercial vehicles were 40,882 last month, a growth of 25 percent, taking cumulative sales to 192,612, a growth of 35 percent.

Jaguar XF

Sales of all passenger vehicles were 44,232 in the month, a growth of 33 percent and the corresponding cumulative sales are 232,326, a growth of 49 percent. Tata passenger vehicle sales, including those distributed, were 28,012 for the month, a growth of 35 percent with a cumulative increase of 50 percent. Jaguar Land Rover global sales in August 2010 were 16,220 vehicles, higher by 29 percent. Jaguar sales for August 2010 were 3,788, higher by 33 percent, while Land Rover sales were 12,432, higher by 28 percent. Cumulative sales of Jaguar Land Rover for the fiscal are 92,759, higher by 46 percent. Cumulative sales of Jaguar are 24,919, higher by 31 percent, while cumulative sales of Land Rover are 67,840, higher by 52 percent.

Tata Motors is planning to launch new models with its Venture MPV and Aria Crossover in the near future.

Son of Hubble — getting expensive

September 18, 2010

Successor to Hubble, the James Webb Space Telescope is now slated for launch in 2014. The $5 billion mission is once again plagued by cost overruns.

Science News reports:

How can astronomers advise NASA on how to trim the costs of developing missions if no one will tell them how much the costliest mission of all, the James Webb Space Telescope, is running over budget?

That’s what Alan Boss, chair of the independent NASA Astrophysics Subcommittee, would like to know. When the subcommittee met in Washington, D.C., on September 16 and 17, Boss and his colleagues already knew that the $5 billion infrared space observatory, the Hubble Space Telescope’s successor now set for launch in 2014, was once again in need of a monetary transfusion.

What Boss wanted to know was how much. But no one in room 3H46 at NASA headquarters was willing to talk dollars and sense — when Boss, an astronomer at the Carnegie Institution for Science in Washington, D.C., asked if anyone in the room could cite a dollar figure, his question was met with a silence as deep as any in the vast empty reaches of intergalactic space.

Fear of making a huge and embarrassing error like the one that produced Hubble Space Telescope’s infamously misshapen primary mirror may be causing JWST scientists and engineers to go overboard and do too much testing, Weiler said. The comprehensive report on JWST due next month, led by John Casani of NASA’s Jet Propulsion Laboratory in Pasadena, Calif., will cite instances where engineers on the mission may be overzealous in testing equipment.

JWST gobbles up about 40 percent of NASA’s astrophysics science budget.

Future of flight …

September 17, 2010

Der Spiegel: What will air travel look like in the year 2050? A special team of engineers from European aircraft manufacturer Airbus have drafted plans for the future of flight. These include a completely transparent fuselage that will allow passengers to see the stars above and city lights below.

“Passengers in an airplane like this would experience flight in a completely new way,” enthuses Axel Krein, 49, head of research and technology for European aircraft manufacturer Airbus. The unconventional idea came from the special team that Krein himself put together to forge ideas for the airplanes of the future.

“We told our engineers to give their imaginations free rein,” Krein explains. “What emerged were completely realistic visions of flight in the year 2050. Our people are grounded in reality, after all. And most of the necessary technology already exists.”

Finnair solar powered helicopter

Finnair 450 bed space hotel

Finnair's flying saucer, carrying 2,400 passengers in 2093

Lockheed Martin Corporation supersonic jet

Finnair supersonic jet

'Concept Plane' by Airbus 2050

Fast wireless charging from Fujitsu

September 17, 2010
Image via Wikipedia: Faraday’s induction experiment

This will be a significant step forward, but if only I could also charge my laptop wirelessly…

From Asahi News:

Kawasaki, Japan, September 13, 2010 — Fujitsu Laboratories Limited today announced the development of wireless recharging technology that enables the design of magnetic resonance-based wireless charging systems that can simultaneously recharge various types of portable electronic devices. Details of this technology are being presented at the 2010 conference of the Institute of Electronics, Information and Communication Engineers (IEICE), opening September 14 at Osaka Prefecture University.

Electromagnetic induction and magnetic resonance are the methods most often used for wireless charging. With electromagnetic induction, a magnetic flux is induced between the power-transmitting and power-receiving coils, and operates based on electromotive force. This method has been used in cordless phones, among other equipment. The drawbacks are that the method only works over short distances, and the power transmitter and power receiver need to be in alignment, so it is effectively no different than using a charging station with a wired connection. By contrast, the magnetic resonance method, which was first proposed in 2006, uses a coil and capacitor as a resonator, transmitting electricity through the magnetic resonance between the power transmitter and power receiver. This method can transmit electricity over a range of up to several meters, and because a single transmitter can power multiple receiving devices, developments are under way for a broad range of potential applications, charging everything from portable electronics to electric cars.
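Both schemes ultimately hinge on a tuned LC circuit: in the magnetic resonance method, transmitter and receiver must resonate at the same frequency, f = 1/(2π√(LC)). A minimal sketch of the tuning arithmetic — the coil and capacitor values below are illustrative assumptions, not Fujitsu's actual design parameters:

```python
import math

def resonant_frequency(inductance_h, capacitance_f):
    """Resonant frequency of an LC tank: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Hypothetical component values, for illustration only
L = 10e-6   # 10 microhenries
C = 25e-9   # 25 nanofarads

f = resonant_frequency(L, C)
print(f"resonant frequency ≈ {f / 1e3:.1f} kHz")   # ≈ 318.3 kHz
```

Power transfers efficiently only when both resonators sit on this shared frequency, which is why nearby metallic and magnetic objects (which detune the effective L) make compact transmitters hard to design — the problem Fujitsu's analysis technology is aimed at.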

What Fujitsu Laboratories has done is to develop technology that dramatically shortens the time required to design transmitters and receivers for magnetic resonance charging systems and, in addition, enables accurate tuning of resonant conditions in the design phase, even for compact transmitters and receivers that are prone to influences from nearby metallic and magnetic objects.

Fujitsu plans to continue using this analysis and design technology in research and development on wireless charging systems for mobile phones and other portable devices, and plans to bring products using it to market in 2012. The company is also looking at applying the results of this work to fields other than portable electronics, including power transmission between circuit boards or computer chips, and providing mobile charging systems for electric cars.

BP Oil Plume was only 1/3 oil, 2/3 was gas

September 17, 2010
Image via Wikipedia: gas from the damaged Deepwater Horizon wellhead

Perhaps this helps to explain where all the oil went.

The plumes of oil that spewed into the Gulf of Mexico’s depths this spring and summer in the aftermath of the BP Deepwater Horizon blowout were actually only about one-third oil, with the remainder consisting of natural gas.

Research reported online September 16 in Science found that in June, marine microbes were primarily feeding on propane and ethane in the oil plumes. “We estimate that there’s about two times as much gas sitting in those subsurface plumes as there is oil — and there’s about a million barrels of oil in them,” says David Valentine of the University of California, Santa Barbara, speaking by phone from a National Oceanic and Atmospheric Administration research vessel in the Gulf. Chemists had been trying to estimate how much oxygen might disappear as microbes began degrading BP’s spilled oil. It now turns out oil is only a tiny part of the issue. “Probably 66 to 75 percent of the oxygen loss — maybe even a bit more — will ultimately come from bacterial metabolism of the gases,” Valentine projects.

https://ktwop.wordpress.com/2010/09/12/microbes-ate-the-bp-oil-plume/

The new research “is quite solid and something people will be taking seriously,” says Benjamin Van Mooy, a chemical oceanographer at the Woods Hole Oceanographic Institution in Massachusetts.

Terry Hazen of Lawrence Berkeley National Laboratory in California and his colleagues recently reported finding substantial microbial degradation of a particular fraction of the spilled oil called n-alkanes in subsea plumes. He says that the work by the Woods Hole team and the authors of the new Science paper doesn’t contradict his group’s findings. “They’re all quite consistent,” he says. Each group looked at different hydrocarbons at different times, and sometimes in different plumes. The environment is dynamic, he notes, and truly understanding what’s happening will take a lot more work.

One big concern since the initial discovery of deep-sea hydrocarbon plumes has been what will happen to oxygen concentrations near the seabed. Some scientists have questioned whether fish-suffocating dead zones might develop. But a September 7 federal study looked for evidence of such oxygen deprivation in plume zones and found none.

Based on four months of sampling data through August 9, “Oxygen levels have dropped by about 20 percent from their long-term average in this area of the Gulf,” said Steve Murawski, chief science advisor to NOAA’s Fisheries Department and head of the largely federal interagency Joint Analysis Group on the BP spill. Oxygen levels in plume zones have stabilized, he said, and “would have to decrease another 70 percent in order to be classified a dead zone.”
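The two percentages in Murawski's statement compound, which is easy to misread. A normalized sketch (the units are arbitrary; only the ratios come from the article):

```python
baseline = 1.00                   # long-term average oxygen level (normalized)
current = baseline * (1 - 0.20)   # levels "have dropped by about 20 percent" -> 0.80

# A dead-zone classification would require another 70 percent decrease
dead_zone = current * (1 - 0.70)  # -> 0.24 of the long-term average

print(current, dead_zone)
```

So plume-zone oxygen would have to fall to roughly a quarter of its long-term average before qualifying as a dead zone — the current 20 percent drop leaves a wide margin.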

PR2 robot now on general sale

September 16, 2010

Asimov’s Laws of Robotics are not yet being put to the test, but the PR2 robot’s two gripper-equipped arms, laser scanner and multiple cameras allow it to fold towels, fetch a beer and plug itself into the mains when it needs to recharge.

The New Scientist reports that Silicon Valley start-up Willow Garage has put its PR2 robot on general sale.

The price tag may be a bit daunting at $400,000.

But for individuals with a proven track record in contributions to the open source community, we are also introducing an award which amounts to a $120,000 discount on PR2 purchases. Details on the open source discount are here.

Image: http://www.willowgarage.com/blog/2010/09/07/pr2-pricing-and-open-source-discount

Video here

Renewable Realities

September 16, 2010
Image via Wikipedia: modern wind energy plant in rural scenery

Renewable energy sources – when they have become commercial – have their part to play. Engineers and scientists have made remarkable progress in the development of concepts, materials, systems and technologies. But the exaggerations and distortions regarding the possibilities follow a political agenda. Fundamentals and common sense are discarded in the almost religious fervour of “environmentalism”, “global warming” and subsidy scams. The realities of what renewables can offer are far from the rosy perceptions that prevail.

It is worth just reminding ourselves of the fundamental constraints which apply:

Generating Capacity: Wind and solar capacity require full back-up capacity but hydro power does not.

  • Wind power is intermittent and cannot be predicted. Generating capacity planning therefore cannot rely on wind power capacity, and 100% back-up in the form of alternate capacity is always needed. Since electrical power cannot be stored at scale, wind power cannot follow load: any variation in the wind power produced must be compensated for by changing the power generated by some other plant. Wind power cannot be despatched.
  • Solar power (thermal or photovoltaic) is intermittent not only between day and night and between winter and summer but also during the hours of sunshine, due to clouds, rain and dust storms. Some limited storage of thermal energy (molten salts, for example) is possible, but storage of electrical power in batteries or the like is not feasible.

    Solar plant capacity must also be backed up by alternate generating capacity, and since solar output falls to zero every night, the back-up required is also around 100% (with some variation due to the particular night-time load profile). Because thermal storage can be available, some load changing during daylight hours is feasible.

  • Hybrid solar thermal – fossil fuel plants can ensure continuous operation and eliminate the back up capacity.
  • The lifetime of components in a solar thermal plant is drastically affected by the enforced cycling caused by daily starts and stops. (Material fatigue and creep considerations are determined by thermal cycling).
  • Hydro power plants are dependent on seasonal water levels in reservoirs (large plants) or on variations of water flow (smaller run-of-the-river plants). Large plants are nearly always used for base-load power (when in season) and, if equipped with a pumped-storage facility, can also store surplus power from other plants. Hydro power plants are always included within the generating capacity base and require no back-up capacity. However, a grid’s load-following needs must usually be met by other types of plant (gas or coal).

Availability and capacity factor:

  • Wind power is available only when the wind blows above a minimum speed (around 4 m/s) and below a maximum (around 25 m/s). It cannot operate in gusting conditions. For safety, ice formation on turbine blades must be avoided, which also sets a minimum ambient temperature for operation. Though the wind turbine machinery may be available for over 90% of the time, wind and weather conditions are the limiting factor, and a turbine – depending on siting – can usually generate power for no more than about 40-50% of the year. It is not possible to predict when it will be in operation or at what load. The resulting capacity factor for a wind turbine is around 20% (i.e. a wind power plant generates only about 20% of its rated output on an annual basis).
  • Solar thermal plants without storage can operate for an annual average of about 8-9 hours per day. With thermal storage they can operate for about 14 or 15 hours per day, and where the solar field augments a fossil-fuel plant, continuous operation is possible. Without storage, a solar thermal plant has a capacity factor of around 20%, which thermal storage can raise to about 40%. Currently, thermal storage adds about 75% to the cost of a solar thermal plant.
  • Solar photovoltaic plants cannot use any form of energy storage and therefore have a capacity factor of around 20%.
  • Large hydro plants running at base-load have capacity factors well above 80% (in-season).
  • Small run-of-the-river hydro plants can have capacity factors ranging from 30% with seasonal flows to over 80% with perennial flows.
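The capacity factors quoted above all follow from one definition: the energy a plant actually delivers in a year divided by what it would deliver running at rated output all year. A quick sketch — the turbine size and annual energy figure are illustrative assumptions, not numbers from the text:

```python
HOURS_PER_YEAR = 8760

def capacity_factor(annual_energy_mwh, rated_power_mw):
    """Fraction of the theoretical maximum annual output actually delivered."""
    return annual_energy_mwh / (rated_power_mw * HOURS_PER_YEAR)

# A hypothetical 2 MW wind turbine delivering about 3,500 MWh in a year
cf_wind = capacity_factor(3500, 2.0)
print(f"wind capacity factor ≈ {cf_wind:.0%}")   # ≈ 20%
```

By the same formula, a solar thermal plant with storage running ~14 hours a day approaches the ~40% figure cited above, while a base-load hydro plant operating near rated output in season clears 80%.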