Tuesday, 30 October 2012

NASA Seeks Student Experiments For 2013 High-Altitude Scientific Balloon Flight

 WASHINGTON -- NASA is accepting applications from graduate and undergraduate university students to fly experiments to the edge of space on a scientific balloon next year. The balloon competition is a joint project between NASA and the Louisiana Space Consortium (LaSPACE) in Baton Rouge.
NASA is targeting fall 2013 for the next flight opportunity for the High Altitude Student Platform (HASP). HASP is a balloon-borne instrument stack that provides an annual near-space flight opportunity for 12 instruments built by students.
A panel of experts from NASA's Wallops Flight Facility in Virginia and LaSPACE will review the applications and select the finalists for the next flight opportunity. Flights are launched from the Columbia Scientific Balloon Facility's remote site in Fort Sumner, N.M., and typically achieve 15 to 20 hours' duration at an altitude of about 23 miles.
HASP houses and provides power, mechanical support, interactivity and communications for the instruments. It can be used to flight-test compact satellites, prototypes and other small payloads designed and built by students. HASP can support about 200 pounds of payloads and test articles. Since 2006, the HASP program has flown 60 payloads involving more than 500 students from 14 states, Puerto Rico and Canada.
The deadline for applications for the 2013 flight is Dec. 14. A question-and-answer teleconference for interested parties will be held Nov. 16.

India exceeds target for terrestrial protected areas

India has already exceeded the target for conserving terrestrial protected areas set under the 2020 Aichi Biodiversity Targets, adopted at CoP-10 two years ago. Under the Aichi Targets, it was agreed that at least 17 per cent of the world’s terrestrial areas and 10 per cent of marine areas would be equitably managed and conserved.
National Biodiversity Authority (NBA) Chairman P. Balakrishna said the country was currently protecting 20 per cent of its terrestrial area, against the target of at least 17 per cent. He was talking to reporters on Thursday after the release of a United Nations Environment Programme (UNEP) report that tracked progress towards internationally agreed targets on the world’s protected areas.
The ‘Protected Planet Report 2012’ said half of the world’s richest biodiversity zones remain entirely unprotected in spite of a 60 per cent increase in protected areas since 1990. Despite the growing number of nature reserves, national parks and protected areas around the globe, current investment in protected areas was only around half of what was needed to support endangered species, protect threatened habitats and deliver the full benefits they could provide.
The report stated that poor management, under-funding and a lack of critical data on protected areas meant that the world was making insufficient progress towards the 2020 goals.
According to recent figures, just 12 per cent of the world’s terrestrial areas were thought to be protected. To meet the CBD target of 17 per cent, an additional six million square km of land and inland waters would have to be recognised as protected by world governments. Overall, protection was higher in developing regions (13 per cent of total area) than in developed regions (11.6 per cent). Around 1.6 per cent of the global ocean area was protected; to meet the target of 10 per cent, an additional eight million square km of marine and coastal areas would need to be recognised as protected, an area just over the size of Australia.
 
 
Article No.2
Indian cities rank low on ‘most prosperous’ list: U.N. report
New Delhi and Mumbai figure low on the list of prosperous cities across the globe, but have the potential to make it to the top rung, says a United Nations report.
Released in the city on Wednesday, the State of the World’s Cities report by U.N. Habitat ranks New Delhi at 58 and Mumbai at 52 among 95 cities. The reason Indian cities rank low is the poor status of development indicators such as infrastructure, environmental conditions and avenues for employment.
Commenting on India’s ranking in the global list, Eduardo Lopez Moreno, chief author of the report, said New Delhi and Mumbai can be described as cities with “medium” performance. He blamed low productivity and the cities’ inability to generate jobs and encourage trade and investment for their poor performance.
“These cities are halfway to prosperity, but both have been penalised for poor environmental conditions, pollution etc. They still need to improve in all dimensions such as productivity, quality of life, adequate infrastructure, equity and environmental sustainability,” Mr. Moreno said.
The report classifies cities into groups based on their performance. While India is in Group IV, countries in Europe and North America are in Group I, which comprises countries demonstrating better integration of all dimensions (productivity, quality of life, adequate infrastructure, equity and environmental sustainability) and a high volume of good services and sound economic fundamentals, Mr. Moreno pointed out.
Though the report focused primarily on New Delhi and Mumbai, it does mention the achievements in the IT sector in cities such as Bangalore, and Hyderabad’s position as the pharmaceutical capital of India.
On information and communications technology in Asian cities, the report says Delhi, Mumbai, Kolkata and Chennai feature mobile telephone connection rates of 138, 112, 102 and 143 per cent respectively.
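Connection rates above 100 per cent simply mean there are more active mobile subscriptions than residents, since the rate is presumably expressed as subscriptions per 100 inhabitants. A minimal illustrative calculation, assuming hypothetical population and subscription figures not drawn from the report:

# Illustrative only: how a mobile connection rate above 100 per cent arises.
# The subscription and population numbers are hypothetical examples, not
# figures from the U.N. Habitat report.

def connection_rate(subscriptions: int, population: int) -> float:
    """Return mobile connections per 100 inhabitants."""
    return 100.0 * subscriptions / population

# A hypothetical city of 10 million people with 14.3 million active SIMs
# would report a rate of about 143 per cent, the figure quoted for Chennai.
print(f"{connection_rate(14_300_000, 10_000_000):.0f} per cent")  # -> 143 per cent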

BHEL collaborates with ISRO

Bhopal: Bharat Heavy Electricals Ltd (BHEL), along with the Indian Space Research Organisation (ISRO), has set up a facility at its Electronics Systems Division for assembling and testing space-grade solar panels using high-efficiency solar cells.
Space-grade solar panels and lithium-ion batteries assembled, tested and supplied to ISRO earlier this year by the electronics division of the city-based BHEL were recently deployed successfully on the GSAT-10 satellite.
“Also supplied were two sets of Li-ion batteries, each consisting of 40 Li-ion cells with a power storage capacity of 160 Ampere-Hours for GSAT-10,” BHEL said in a statement. Both the space solar panels and the batteries were built and tested to strict space quality standards at the BHEL facility by highly skilled manpower, it added.
     For further information visit: http://www.epcworld.in/epcnews/bhel-collaborates-with-isro.aspx

GIS NEWS:

 

Article No.1
TerraGo brings collaboration to non-GIS experts
TerraGo Technologies, part of the Carahsoft umbrella, recently previewed its latest release, TerraGo Mobile for Android, and announced the acquisition of Geosemble, which brings the GeoXray product into the fold.
Parent company Carahsoft is a government IT solutions provider supplying software for federal, state and local government agencies. Under the company umbrella are solutions not only from TerraGo, but also from Adobe and other geospatial intelligence vendors.
Jim Sheen, vice president of products and services for TerraGo; Jessica Sunday, technical account manager; and Nathan Jones, vice president of engineering, spoke about the recent news in a webinar. TerraGo’s claim to fame is its unique GeoPDF format, which allows geospatial information to be accessed and displayed in a PDF. TerraGo covers collaboration and workflows for deploying GeoPDFs for maps and imagery.
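As a concrete illustration of what “geospatial information in a PDF” means in practice, the georeferencing embedded in such a file can also be read with open tools such as the GDAL library, whose PDF driver understands geospatial PDFs. The sketch below assumes a GDAL build with that driver and a hypothetical file name; it is not part of TerraGo’s own tooling:

# Minimal sketch: inspect the georeferencing of a geospatial PDF with GDAL.
# Assumes a GDAL build that includes the PDF driver; "example_map.pdf" is a
# hypothetical file name, not a TerraGo product sample.
from osgeo import gdal

dataset = gdal.Open("example_map.pdf")  # open the PDF as a raster dataset
if dataset is None:
    raise RuntimeError("PDF driver unavailable or file is not georeferenced")

# GetGeoTransform() returns (origin_x, pixel_width, 0, origin_y, 0, -pixel_height),
# the mapping from pixel coordinates to map coordinates.
print("Geotransform:", dataset.GetGeoTransform())
print("Projection:", dataset.GetProjection())  # coordinate system as WKT
print("Size:", dataset.RasterXSize, "x", dataset.RasterYSize)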
For the enterprise, TerraGo provides a suite of applications that help small and large enterprises, including Fortune 100 companies, to produce, access and share geospatial information with anyone, anywhere. These applications are for those who are not GIS experts and don’t have access to GIS software, as well as those who do.
Round-trip workflows can be designed with TerraGo that travel from the enterprise out to its edge and back. The upcoming TerraGo V6 serves as a platform for geospatial applications and for moving spatially aware information among different users and systems throughout the workflow. “Field users don’t have to be GIS experts, so that users represent a wide range of different skill sets,” said Sheen. “We make solutions as simple to use and economical as possible. They can be used online and in a disconnected offline way.”
This is especially important when communications are unavailable, for example in military environments. “TerraGo Publisher plugs into your existing GIS to deploy the geospatial assets as GeoPDF maps and imagery,” said Sheen. “So it allows you to take the complexity built into your maps and imagery in the GIS system, simplify it and make it interactive and portable, so it can be used downstream collaboratively.”
PDF maps and imagery can be further extended using TerraGo Composer to build and configure different types of geospatial apps, for example GeoPDF map books or digital atlases, that can be deployed to the field with either the TerraGo Toolbar or TerraGo Mobile. Toolbar and Mobile enable end users to interact with the maps and imagery, gather on-the-ground intelligence and collaborate with other users. In some cases the end result for the customer is getting data out to the field, where remote workers can collaborate with one another; the updated field data can then be entered back into the enterprise GIS.
The new version 6, to be available in a couple of months, will contain TerraGo Publisher for ArcGIS, Composer for Acrobat and TerraGo Toolbar. New enhancements in annotation and geomarking have been added to Toolbar and Composer, so end users can use Adobe Reader with Toolbar to add and edit annotations and geomarks on any PDF produced directly in the TerraGo system. As GeoPDFs are created, they become immediately available.
Geoforms are data-entry forms that can be attached and georeferenced to geomarks and annotations. The forms can be distributed to field workers for field data collection and real-time sharing, and that data can also be consolidated back into the enterprise GIS.
In summary, TerraGo is well positioned to move into the mobile and non-GIS-expert market, making GIS and geospatial data accessible to a broader set of users by extending the reach of GeoPDF. It will be interesting to see where the company goes with the new offerings. With its simple but elegant link to Adobe PDF, coupled with the recent acquisition of Geosemble for data mining, the possibilities look endless.
Article No.2
Geospatial Solutions Aid in Recovery of Devastating Waldo Canyon Fires in Colorado Springs
This summer, Colorado experienced one of the most devastating fires in its history. The blaze decimated more than 18,000 acres of forest in Waldo Canyon and the Pike National Forest near Colorado Springs.  As a result, more than 32,000 residents were evacuated and 346 homes were subsequently destroyed.
After a natural disaster, first responders and other decision makers need geospatial data that are actionable, as well as easy to manage and understand to effectively save lives and assess damage.  In addition to needing information quickly, the geospatial data must be able to be used by personnel who do not have GIS tools or training.
For the recent Waldo Canyon fires, geospatial solutions from TerraGo Technologies and Colorado-based DigitalGlobe played a key role in aiding disaster recovery, emergency planning and change detection within the areas most impacted by the blaze.
 
Using DigitalGlobe’s WorldView-2 satellite imagery, analysts turned to Esri ArcGIS® Desktop to create maps that were then output as TerraGo GeoPDF® maps, produced by TerraGo Publisher® and compiled into map books by TerraGo Composer® for Adobe® Acrobat®. The GeoPDF maps can be downloaded here.
Natural disasters are unpredictable, and having current intelligence enables emergency personnel to adapt on the fly as conditions change and save lives. We are gratified that our software and solutions aided first responders and other emergency personnel with the intelligence they needed for effective, efficient decision-making in the wake of this disaster.

Arctic ice could vanish within 10 years: Scientists

Arctic sea ice could vanish within 10 years because it is melting much faster than previously believed due to global warming, scientists warn, claiming that the process is 50 percent faster than current estimates.
A new satellite operated by the European Space Agency paints a grim picture: some 900 cubic km of ice melted over the last year.
This is 50 percent higher than the current estimates from environmentalists, they claim. It is suggested that the increase is down to global warming and rising greenhouse gas emissions, the Daily Mail reports.
The entire region could eventually be free of ice if the estimates prove accurate. This would trigger a ‘gold rush’ for oil reserves and fish stocks in the region. “Preliminary analysis of our data indicates that the rate of loss of sea ice volume in summer in the Arctic may be far larger than we had previously suspected,” said Seymour Laxon, of the Centre for Polar Observation and Modelling at University College London (UCL), where CryoSat-2 data is being analysed.
Scientists launched the CryoSat-2 probe in 2010 specifically to study ice thickness. Until then most studies had focused on the coverage of the ice. Submarines were also sent into the water to analyse the ice. The methods are said to have given a picture of changes in the ice around the North Pole since 2004.
Data from these observations show that in winter 2004 the volume of sea ice in the central Arctic was approximately 17,000 cubic km. This winter it was 14,000 cubic km, according to CryoSat.
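A back-of-the-envelope comparison of those two winter figures (a rough illustration only; measurement uncertainty and year-to-year variability are ignored):

# Rough illustration of the winter sea-ice volume decline quoted above;
# it simply compares the two CryoSat-derived figures cited in the article.
volume_winter_2004 = 17_000  # cubic km, central Arctic, winter 2004
volume_winter_now = 14_000   # cubic km, central Arctic, "this winter"

loss = volume_winter_2004 - volume_winter_now
print(f"Loss: {loss} cubic km (~{100 * loss / volume_winter_2004:.0f}% of the 2004 volume)")
# -> Loss: 3000 cubic km (~18% of the 2004 volume)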
Chris Rapley, professor at UCL added: “Before CryoSat, we could see summer ice coverage was dropping markedly in the Arctic. But we only had glimpses of what was happening to ice thickness. Obviously if it was dropping as well, the loss of summer ice was even more significant.”

Science News - NASA STEREO observes 1 of the fastest CMEs on record

This image was captured by ESA and NASA's Solar and Heliospheric Observatory (SOHO) on July 22, 2012 at 10:48 p.m. EDT. On the right side, a cloud of solar material ejects from the sun in one of the fastest coronal mass ejections (CMEs) ever measured. Credit: ESA/NASA.
On July 23, 2012, a massive cloud of solar material erupted off the sun's right side, zooming out into space and passing one of NASA's Solar TErrestrial RElations Observatory (STEREO) spacecraft along the way. Using the STEREO data, scientists at NASA's Goddard Space Flight Center in Greenbelt, Md., clocked this giant cloud, known as a coronal mass ejection, or CME, as travelling between 1,800 and 2,200 miles per second as it left the sun.
Conversations began to buzz and the emails to fly: this was the fastest CME ever observed by STEREO, which since its launch in 2006 has helped make CME speed measurements much more precise. Such an unusually strong bout of space weather gives scientists an opportunity to observe how these events affect the space around the sun, as well as to improve their understanding of what causes them.
"Between 1,800 and 2,200 miles per second puts it without question as one of the top five CMEs ever measured by any spacecraft," says solar scientist Alex Young at Goddard. "And if it's at the top of that velocity range it's probably the fastest."The STEREO mission consists of two spacecraft with orbits that for most of their journey give them views of the sun that cannot be had from Earth. Watching the sun from all sides helps improve our understanding of how events around the sun are connected, as well as gives us glimpses of activity we might not otherwise see.
On July 23, STEREO-A lay - from Earth's perspective - to the right side and a little behind the sun, the perfect place for seeing this CME, which would otherwise have been hard to measure from Earth. The Solar Heliospheric Observatory (SOHO), an ESA and NASA mission, also observed the CME. It is the combination of observations from both missions that makes scientists confident in the large velocities they measured for this event.
Measuring a CME at this speed, travelling in a direction safely away from Earth, represents a fantastic opportunity for researchers studying the sun's effects. Rebekah Evans is a space scientist working at Goddard's Space Weather Lab, which works to improve models that could someday be used to improve predictions of space weather and its effects. She says that the team categorizes CMEs for their research in terms of their speed, with the fastest ones - such as this one - labelled "ER" for Extremely Rare.
"Seeing a CME this fast really is so unusual," says Evans. "And now we have this great chance to study this powerful space weather, to better understand what causes these great explosions, and to improve our models to incorporate what happens during events as rare as these."
Orbiting the sun some 89,000,000 miles away, STEREO-A could observe the speed of the CME as it burst from the sun, and it provided even more data some 17 hours later as the CME physically swept by - having slowed by then to about 750 miles per second. STEREO has instruments to measure the magnetic field strength, which in this case was four times as strong as in the most common CMEs.
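As a rough consistency check on the distance, transit time and speeds quoted above, a back-of-the-envelope average (illustrative only; the CME decelerated continuously, so this is not how the spacecraft measurements were made):

# Back-of-envelope check: average speed implied by the 17-hour transit to STEREO-A.
# Illustrative only; the CME left the sun at roughly 1,800-2,200 miles per second
# and swept past the spacecraft at about 750 miles per second.
distance_miles = 89_000_000  # approximate sun-to-STEREO-A distance quoted above
transit_hours = 17

average_speed = distance_miles / (transit_hours * 3600)  # miles per second
print(f"Average transit speed: ~{average_speed:,.0f} miles per second")
# -> ~1,454 miles per second, between the departure and arrival speeds, as expected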
When a CME with strong magnetic fields arrives near Earth, it can cause something called a geomagnetic storm that disrupts Earth's own magnetic environment and can potentially affect satellite operations or in worst-case scenarios induce electric currents in the ground that can affect power grids.
"We measure magnetic fields in 'Tesla' and this CME was 80 nanoTesla," says Antti Pulkkinen, who is also a space weather scientist at Goddard. "This magnetic field is substantially larger even than the CMEs that caused large geomagnetic storms near Earth in October 2003. We call those storms the Halloween storms and scientists still study them to this day."
While large, this magnetic field measurement is still smaller than that of one of the greatest space weather events on record, the Carrington Event of 1859, during which the magnetic fields at Earth measured 110 nanotesla.
When the CME passes over one of the STEREO spacecraft, the instruments can also measure the direction in which the magnetic field points - a crucial data point, since it is the southward-pointing magnetic fields in a CME, running opposite to Earth's own magnetic fields, that can cause the most disruption. This CME travelled with an unusually large southward magnetic field of 40 nanotesla that stayed steady for several hours.
The event also pushed a burst of fast protons out from the sun. The number of charged particles near STEREO jumped 100,000 times within an hour of the CME's start. When such bursts of solar particles invade Earth's magnetic field they are referred to as a solar radiation storm, and they can block high frequency radio communications as used, for example, by airline pilots.
Like the CME, this solar energetic particle (SEP) event is also the most intense ever measured by STEREO. While the CME was not directed toward Earth, the SEP did - at a much lower intensity than at STEREO - affect Earth as well, offering scientists a chance to study how such events can widen so dramatically as they travel through space.
Evans points out that all of this solar activity was produced by a specific active region that NASA's space weather scientists had been watching for three weeks before the super fast eruption on July 23.
"That active region was called AR 1520, and it produced four fairly fast CME's in Earth's direction before it rotated out of sight off the right limb of the sun," says Evans. "So even though the region had released multiple CMEs and even had an X-class flare, its strength kept increasing over time to eventually produce this giant explosion. To try to understand how that change happens makes for very exciting research."
STEREO is but one of several missions that observe the sun constantly, and the data is always interesting as there is much to be learned from observing the quiet sun as well as an active one. But the sun displays an activity cycle during which it gets more active approximately every 11 years as it heads toward what's called "solar maximum." The next solar maximum is currently predicted for 2013. We can expect more and more space weather events until then, and each one will help scientists better understand the sun and how its effects can permeate the entire solar system.

Science News - Mars rover takes 'cool' detour: NASA

The US space agency NASA's Mars rover Curiosity will make a wide detour to explore a "cool" geographical hot spot on Mars, scientists said Friday. The scientists also reported they found temperatures in the Red Planet's Gale Crater to be just above freezing, the first monitoring of Mars temperatures in three decades.
Before driving to its destination at Mount Sharp, which may contain traces of water, Curiosity will head in the opposite direction, to a spot NASA's Jet Propulsion Laboratory has dubbed Glenelg. The Pasadena lab said the geologically rich area marks the intersection of three kinds of terrain 1,640 feet (500 meters) from the rover's landing site.
A light-colored patch of terrain in the region indicates to scientists "a kind of bedrock suitable for eventual drilling by Curiosity." A cluster of small craters may represent "an older or harder surface" and another spot features a patch of land resembling the rover's landing site, before the nuclear-powered apparatus "scoured away some of the surface," NASA said.
Scientists said they chose the name Glenelg because it is a palindrome -- a word read the same way backward and forward -- and the rover will need to travel back in the same direction to head toward Mount Sharp. The Glenelg trek will be the rover's first "moderate duration drive target," Mars Science Laboratory project scientist John Grotzinger told reporters, explaining the decision to risk travelling off the planned route.
"It looks cool," he said.
Grotzinger estimated the rover's journey will take between three weeks and two months to arrive at Glenelg, where it will stay for roughly a month, before heading to the base of Mount Sharp. Analysts have said it may be a full year before the remote-controlled rover gets to the base of the peak, which is believed to be within a dozen miles (20 kilometers) of the rover's landing site.
A photo of the lower reaches of Mount Sharp, taken from Curiosity's landing site, shows "hills, buttes, mesas and canyons on the scale of one-to-three-story buildings." Scientists hope the hydrated minerals thought to be concentrated in the bottom half of the photographed lower reaches will "reveal the area's geological history."
The Mars Science Laboratory is expected to travel as far as halfway up Mount Sharp, a towering three-mile Martian mountain with sediment layers that may be up to a billion years old. NASA plans to obtain photos of the summit "in a week or two." Grotzinger noted the team's report on the Martian crater's temperature was "really an important benchmark for Mars Science."
"It's been exactly 30 years since the last long duration monitoring weather station was present on Mars," when Viking 1 stopped communicating with Earth in 1982," he said. The $2.5 billion rover arrived on Mars at 0531 GMT on August 6.