PEN American Center has announced that acclaimed naturalist and two-time Pulitzer Prize winner Dr. Edward O. Wilson and actor and conservationist Harrison Ford have joined with PEN to create the PEN/E.O. Wilson Award for Literary Science Writing.
The $10,000 award, which will go to one non-fiction book each year, is designed to “acknowledge new and compelling literary writing about the physical and biological sciences. Beginning in 2011, a winner will be chosen by PEN American Center each fall, and will receive a prize of $10,000. Dr. Wilson and Mr. Ford have provided funding for the first three years of the award’s conferral.”
Looks innocuous enough, doesn’t it? Just another set of directions provided by Google Maps.
Not as far as Lauren Rosenberg is concerned.
SearchEngineLand reported Friday that Rosenberg is suing Google after she suffered an accident while following the walking directions Google provided:
Rosenberg used Google Maps on January 19, 2009, via her Blackberry, to get directions between 96 Daly Street, Park City, Utah and 1710 Prospector Avenue, Park City, Utah. Google provided these, telling her as part of the route to walk for about 1/2 mile along the calm-sounding “Deer Valley Drive.”
That’s an alternative name for that section of Utah State Route 224, a highway that lacks sidewalks, the case says. Rosenberg wasn’t warned about this, putting Google directly at fault in the accident, the case claims:
Defendant Google, through its “Google Maps” service, provided Plaintiff Lauren Rosenberg with walking directions that led her out onto Deer Valley Drive, a.k.a. State Route 224, a rural highway with no sidewalks, and a roadway that exhibits motor vehicles traveling at high speeds, that is not reasonably safe for pedestrians.
The Defendant Google expects users of the walking map site to rely on the accuracy of the walking directions given….
As a direct and proximate cause of Defendant Google’s careless, reckless, and negligent providing of unsafe directions, Plaintiff Lauren Rosenberg was led onto a dangerous highway, and was thereby stricken by a motor vehicle…
So, should Google be liable? Or did Rosenberg leave her common sense at the door when she decided to walk along a highway that has no sidewalk?
Check out the full story here.
Atomic force microscopes (AFMs) are expensive pieces of kit, however, well beyond the means of most private individuals.
Another technology that fascinates me is the 3-D printer, which is used for rapid design prototyping in three dimensions. 3-D printers have really come into their own in recent years, with the cheapest models starting to become affordable for private buyers and the concept of desktop manufacturing beginning to take off.
So, you can imagine my delight when I stumbled across this tutorial that shows how to build a cheap AFM head using a 3-D printer.
As the author explains:
As the acquisition cost for commercially available AFMs is in the order of some hundred thousand dollars, this is an approach to make these instruments available to more research groups. Most of the structure can be made with rapid prototyping methods; all that is left to do is to screw together the pieces. Nevertheless, the user is supposed to have some experience with the matter, as he doesn’t get the support that comes with a commercial instrument.
While I won’t be making an AFM anytime soon (I lack the time and expertise to do anything but dream), it’s great to see the DIY spirit entering the world of high-tech microscopy.
BTW, check out the fabbaloocious Fabbaloo blog for regularly updated news about the world of 3-D printing. Is it inconceivable that in the future we will be able to print out new limbs for people using their personal genetic code?
A pan-European team of robotics researchers began a project this year that could see humanoid bots interact with groups of people in a realistic, anthropomorphic way, for the first time.
The “Humanoids with auditory and visual abilities in populated spaces” (HUMAVIPS) project has the ambitious goal of making humanoid bots just that bit more human by building algorithms that will enable bots to mimic what psychologists call the “cocktail party effect”: the human ability to focus attention on just one person in the midst of other people, voices and background noise.
If successful, HUMAVIPS will give future humanoid bots something that existing bots don’t possess: the simple social skills necessary to deal with small groups of people, including the basic intelligence to pick out a group of humans and determine which of them want to interact with the robot. It could also endow bots with the ability to infer meaning from incoming sense data, which would be a rudimentary step towards truly anthropomorphic robot intelligence.
Led by Radu Horaud, Director of Research at INRIA, the three-year project, which has attracted 2.6m euros in European Commission funding, builds on the POP project (see Wired’s December report), which provided proof-of-concept for the idea that combining auditory and visual information improves a bot’s ability to identify human speakers in the midst of background noise.
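To get a feel for what audio-visual fusion means in practice, here is a toy sketch (my own illustration, not the POP or HUMAVIPS algorithm): each detected face is scored by how closely its direction matches the estimated directions of arriving sound, and the best-matching face is taken as the active speaker. All names and numbers are made up for the example.

```python
import math

def fuse_speaker_scores(audio_angles, face_angles, sigma=10.0):
    """Toy audio-visual fusion: score each detected face (given as an
    angle in degrees) by how well it matches the audio direction-of-arrival
    estimates, and return the index of the most likely active speaker."""
    scores = []
    for face in face_angles:
        # Sum of Gaussian affinities between this face and each audio estimate
        s = sum(math.exp(-((face - a) ** 2) / (2 * sigma ** 2)) for a in audio_angles)
        scores.append(s)
    return scores.index(max(scores))

# Three faces at -30°, 0°, +40°; sound arrives from roughly +38° to +42°
speaker = fuse_speaker_scores(audio_angles=[38.0, 42.0],
                              face_angles=[-30.0, 0.0, 40.0])
print(speaker)  # → 2 (the face at +40° is nearest the sound source)
```

The point of combining the two modalities is that each compensates for the other’s weakness: vision finds candidate people but cannot tell who is talking, while audio localisation is noisy on its own.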
Read more about HUMAVIPS here.
Another recent story for Wired:
Robots of the future will be capable of learning more complex behaviours than ever before if a new, pan-European research project succeeds in its goal of developing the world’s first architecture for advanced robotic motor skills.
If successful, the four-year AMARSi (Adaptive Modular Architecture for Rich Motor Skills) project (which started this month) could see a manufacturing world filled with autonomous, intelligent humanoid worker bots that can learn new skills by interacting with their co-workers. It could also see a society with personal carer bots capable of quickly adapting to complex environments and changing human needs.
If the researchers are successful, the 7 million euro, EU-funded project will enable humanoid (and quadruped) bots to autonomously learn and develop motor skills in open-ended environments in the same way humans do — by learning from the data provided by movement and essentially rewiring their circuits to process and store the new knowledge they’ve acquired.
It’s all a far cry from the limited learning and motor-skill capabilities of existing bots, and it will rely on a suitably advanced range of technologies to make it happen: dynamic neural networks built on reservoir computing principles, new robotics hardware designs, and sophisticated software algorithms are all involved.
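Reservoir computing, one of the technologies mentioned above, is easier to grasp with a minimal sketch. The idea is that a large, fixed, random recurrent network (the “reservoir”) transforms an input stream into a rich state, and only a simple linear readout is trained on top of it. The echo state network below is my own illustrative toy, not AMARSi’s architecture; it learns a simple temporal mapping (predicting a slightly phase-shifted sine wave) from example data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir: its weights are never trained
n_res, n_in = 100, 1
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # keep spectral radius below 1

def run_reservoir(inputs):
    """Drive the reservoir with an input sequence and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Train only the linear readout: predict sin(t + 0.1) from sin(t)
t = np.linspace(0, 8 * np.pi, 400)
u, y = np.sin(t), np.sin(t + 0.1)
X = run_reservoir(u)
W_out = np.linalg.lstsq(X[50:], y[50:], rcond=None)[0]  # skip initial transient
pred = X[300:] @ W_out
print(np.mean((pred - y[300:]) ** 2))  # mean squared error stays small
```

Because only the readout is trained (a single least-squares fit), learning is fast and cheap, which is one reason reservoir approaches appeal to roboticists who need motor skills acquired online.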
AMARSi relies on a “more-or-less unusual,” biologically inspired view of motor skills that goes beyond traditional robotic designs and is better suited to truly autonomous robots, says Project Coordinator Jochen Steil, Director of the Cognitive Robotics and Learning Laboratory (CoR-Lab) at Bielefeld University in Germany.
Read more about the AMARSi project here.