Forget moving mountains. Marine biologist Andrew Perry moves icebergs. And his latest adventure led to the discovery of an icy archway, right in the middle of the ocean.
Perry was out trawling for icebergs with Oceans Limited, a Canadian company that identifies which of the tremendous floaters are drifting towards stationary deep-water oil rigs, when he found the arch -- think Stargate meets portal to Narnia.
"It was a beautiful day, hardly a wave on the water. And then there it was -- a big beautiful arch," Perry told FoxNews.com. "No one had seen anything like this. We thought it was amazing."
Icebergs routinely break off Greenland and float down the Labrador coast, Perry explained, a corridor he called "iceberg alley." Along the way, they pose a direct threat to deep-water oil installations. Though they don't move particularly quickly -- typically one to four knots -- they've got enough bulk to do major damage if they hit anything, he explained.
"We recorded some upwards of 350,000 tons," Perry said. Oceans Limited moves smaller icebergs by training water cannons on them for hours. "That's for the smaller ones, we call them growlers," Perry told FoxNews.com. It's much cheaper to move the icebergs, even the very large ones, than to disconnect the oil rig and move it, he pointed out: Moving a rig costs millions, while operating a small boat costs about $25,000 per day.
So Perry's company either lassos the big boys with a single boat or corrals them with a net dragged between two boats. Icebergs don't move particularly fast, Perry explained, so changing their course can take quite a while, but they don't have to move too many each year.
"Depending on the ice season, they may have to tow 10 to 20 ... during the 2009 season we profiled around 60 icebergs to get computer generated 3D images," Perry said.
But he had never run into an iceberg like this one before.
Icebergs are often seen as just giant chunks of compressed water, not stunning works of natural art. Yet beautifully sculpted icebergs like the one Perry found are actually fairly common, thanks to the natural forces of the seas and the skies, explained Ted Scambos, lead scientist for the National Snow and Ice Data Center at the University of Colorado in Boulder.
"Complex, sculptured icebergs like this are usually formed from ice that broke off of fast-flowing glaciers," Scambos told FoxNews.com. "It starts off as a rugged piece of ice that waves and sunshine then sculpt."
Sure, but how did this iceberg form in such a stunning fashion? Wave action, Scambos explained, and it's more common than you might think.
"As the waves begin to pound out a dimple in the ice facewall, it focuses the wave energy, leading to more rapid erosion at the center. So, with time, the waves carve through the face to the other side," Scambos told FoxNews.com. "It's not the first one I've seen, but it's the most artistic."
Icebergs are surprisingly noisy as well, according to Perry. They're constantly moving and cracking, he said. The arch "sounded like shotguns being fired off all the time, due to the ice cracking."
And what to do with all of that ice? Perry and his fellow biologists have a unique use for icebergs: They put them in cocktails.
"To be honest it's the cleanest water you can get. The air bubbles trapped in it are under so much pressure the ice fizzes when it melts."
"Who doesn't want 500,000-year-old ice in their drink?" he joked.
Wednesday, 22 December 2010
Arcgis.com bigger than Flickr?
Ok, so ESRI confirmed that there were indeed 400 million maps created on arcgis.com in October, but I am still not convinced…
It is not that it is impossible for ESRI to achieve such growth in one month; it's that there just isn't any (publicly accessible) evidence to suggest that the number is anywhere close. To illustrate my point, let's compare Arcgis.com to another website with massive amounts of user-generated content. Since there is no such website in the GIS space, let's take Flickr. I know that this is not an "apples to apples" comparison, but since we are comparing only the amount of user-generated content, it will work.
Here is what we know about Flickr: According to Compete, Flickr enjoyed almost 20 million unique visitors in October who uploaded more than 3000 images per minute. If we do a rough calculation (3000 photos * 1440 minutes * 31 days) we will arrive at 133,920,000 photos for October.
Now let’s look at Arcgis: Compete shows roughly 33K unique visitors for October and the numbers from Alexa are not much different. As I noted earlier, ESRI claims 400,000,000 user-generated maps for this period. This means that every unique visitor had to have created over 12K maps.
Either I am totally off in my calculations or Arcgis.com is roughly three times larger than Flickr in terms of user-generated content! This is excellent news for map geeks all over the world – making maps is finally more popular than uploading photos.
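To put the back-of-envelope arithmetic in one place, here is a minimal sketch that reproduces the figures above; the upload rate, visitor count and the 400 million claim are simply the numbers reported by Compete and ESRI, not verified data.

```python
# Back-of-envelope check of the figures quoted above.
# All inputs are the reported Compete/ESRI numbers, not verified data.

flickr_photos_per_minute = 3000
minutes_per_day = 24 * 60           # 1440
days_in_october = 31

flickr_photos_october = flickr_photos_per_minute * minutes_per_day * days_in_october
print(flickr_photos_october)        # 133,920,000 photos

arcgis_claimed_maps = 400_000_000
arcgis_unique_visitors = 33_000

print(arcgis_claimed_maps / arcgis_unique_visitors)   # ~12,121 maps per unique visitor
print(arcgis_claimed_maps / flickr_photos_october)    # ~3.0x the estimated Flickr uploads
```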
Tuesday, 21 December 2010
London Underwater Tube Map. London by the Sea
Wet commutes forecast as parts of London predicted to be underwater by the end of the century
Large areas of London could be underwater by 2100 as a direct result of climate change, research has shown.
Development charity Practical Action has released an alternative tube map that highlights the impact climate change and rising sea levels could have on the capital.
If the climate change talks scheduled to begin in Cancun this week are not successful, global temperatures could rise by 4C by the end of the century. This, in turn, could lead to a 4m rise in sea levels, proving catastrophic for London and potentially devastating for developing countries.
The "London Underground Map 2100" highlights those areas that could be underwater if no action on climate change is taken including Westminster and the Houses of Parliament, London Bridge, Embankment, Sloane Square and Canary Wharf.
All of which would mean people would potentially face a swim rather than a walk to their jobs in the city and cause embarassment for the UK on the world stage and affect how London is perceived for business and finance.
Margaret Gardner, Director, Practical Action said "if no action is taken against the temperature and sea levels rise as predicted, large areas of London could be underwater by the end of the century - a frightening thought. But what's more frightening are the effects that will be felt in developing countries where people are already living on the front line of climate change and experiencing the worst effects of floods, droughts and extreme temperatures."
"In London we have an insurance industry and the necessary capital to do something about increased flooding risk. We can build barriers and do whatever is necessary. But in Dhaka and other cities in the developing world, there isn't the spare cash to just invest in infrastructure to help people to adapt to climate change. So the answer has to be to avoid climate change in the first place."
"Practical Action works extensively with communities living in these areas helping them to adapt to their changing climate but without action on climate change, the consequences will be too catastrophic to overcome."
Practical Action works with poor communities around the world helping them to adapt to the effects of climate change. From teaching Bangladeshi villagers to build floating gardens on flood waters in order to feed their families, to introducing camels in drought-prone regions of Kenya.
For more information and to sign up to Practical Action's climate change campaign 'Face up to 4C' please visit www.practicalaction.org/faceup.
For further information, please contact Abbie Upton, Practical Action Media Officer, on 01926 634510 or 07714 205342.
How London's Tube map could look in 2100 if climate change talks in Cancun are not successful, leading to a 4m rise in sea levels - click to open hi-res PDF version
Notes for Editors:
Practical Action believes that the right idea, however small, can change lives.
Practical Action is an international development charity with a difference, working together with some of the world's poorest women, men and children, helping to alleviate poverty in the developing world through the innovative use of technology.
Practical Action's particular strength is its 'simple' approach: finding out what people are doing and helping them to do it better. This enables poor communities to build their own knowledge and skills to produce sustainable and practical solutions: driving their own development.
Whether enabling women and men in Darfur to feed their families, providing people in Bangladesh with the chance to control the impact of flooding on their lives or working with remote communities in Peru to introduce electricity, Practical Action's activities are always people focused, locally relevant and environmentally sensitive, offering tangible ways out of poverty.
Practical Action won The Ashden Award for Light and Power in 2007 for its micro-hydro work in Peru, bringing electricity to over 30,000 people living in remote Andean villages.
*Source material here:
* http://legacy.london.gov.uk/mayor/strategies/sds/docs/regional-flood-risk09.pdf
Saturday, 18 December 2010
You Must Watch This! Hans Rosling's 200 Countries, 200 Years, 4 Minutes - The Joy of Stats
Simply awesome video; be sure to ping it to your mates!
Friday, 17 December 2010
In-Theater Intelligence? There's An App For That
U.S Army Soldier Equipment To Be Modernized With iPhone and Android Smartphones
With the ever-reducing cost and ever-increasing capability and durability of smartphone technology, U.S. Army troops could find themselves armed with an iPhone as soon as February 2011, according to a report in USA Today.
The report states that, as part of its Connecting Soldiers to Digital Applications (CSDA) program, the U.S. Army is working on a soldier technology solution that would make an iPhone or Android-based smartphone a standard piece of equipment for every soldier - including picking up the tab on the monthly bill.
"One of the options potentially is to make it a piece of equipment in a soldier's clothing bag," Lt. Gen. Michael Vane, Director of the Army Capabilities Integration Center (ARCIC) told USA Today, adding that most soldiers think this soldier modernization plan is too good to be true.
Mike McCarthy, director of the mission command complex of Future Force Integration Directorate at Fort Bliss, disagrees. McCarthy told USA Today about his vision for a digitally connected army from the ground up.
"What we're doing is fundamentally changing how soldiers access knowledge, information, training content and operational data," McCarthy said. "The day you sign on to be a soldier, you will be accessing information and knowledge in garrison and in an operational environment in a seamless manner. We're using smartphone technologies to lead this."
But it isn't only smartphones. McCarthy explained to USA Today that they were looking at everything from iPads, Kindles, and Nooks, to mini-projectors. Fortunately, the U.S. Army is planning on keeping its options open instead of signing any kind of exclusivity agreement, Rickey Smith, Director of ARCIC-Forward, told USA Today.
"We're not wedded to a specific piece of hardware. We are open to using Palm Trios, the Android, iPhone or whatever else is out there."
The Army plans to begin issuing phones, network equipment and applications to the first Army brigade to be modernized under the brigade combat team modernization program in February. That test will not be limited to smart phones but will include any electronic devices that may be useful to troops.
Realising both the cost- and time-saving benefits of using 'tried-and-tested' technology, much like its utilization of Xbox 360 controllers, the U.S. Army also states it has no plans to develop its own devices, instead opting to make minor tweaks and to make the equipment rugged enough for the field.
Of course, unlike the one-time purchase of an Xbox controller, using mobile communications technology requires an ongoing payment plan, and a system for purchasing is still being worked out. One option being considered is to give soldiers a monthly stipend to spend on minutes, data, and apps, allowing them to customize their devices to their specific requirements.
"If you did it that way, the advantage would be to pay for the phone once and then you pay a maintenance fee to the soldier ... and then the soldier can buy whatever iPhone, Android or hardware that he or she likes," Vane said. "Then the challenge is just figuring out how we pay for the minutes each month."
But it's not just about paying for extra minutes; the other big question for app-based technology is data security. Although testing over classified networks has not yet been conducted, once these issues are addressed the devices would allow soldiers to access real-time geospatial intelligence and mapping systems in-theater.
Smartphones would give soldiers access to real-time intelligence and video from overhead unmanned systems, and let them track friends and enemies on a dynamic map whilst on the battlefield, officials said.
"What we're doing is fundamentally changing how soldiers access knowledge, information, training content and operational data," McCarthy said
According to recent test results, the Army has found that soldiers are far more likely to collect and share data when equipped with smartphones, making more meaningful, up-to-date and actionable data available and improving soldier management both on the battlefield and back at HQ.
Vane said he wants to use the phones to collect biometrics on enemy combatants.
"Can we connect this to biometrics? Well, that's the direction we're headed," he said.
"The challenge will be to work through the policy issues of sharing data and information assurance," Vane continued. "Army officials remain concerned of enemy forces hacking into the phones, but don't want that fear to paralyze the use of these phones."
Vane told USA Today that a widespread battlefield deployment could happen as soon as 2011.
Lt. Gen. Michael Vane will be speaking on this and related topics at Soldier Technology US, 31 Jan - 3 Feb, Arlington, Virginia. To make a reservation or for the full agenda visit www.soldiertechnologyus.com or email us at soldiertechnologyus@wbr.co.uk
NASA's Global Temperatures. The World Gets Warmer!
The world is getting warmer! Whether the cause is human activity or natural variability—and the preponderance of evidence says it’s likely humans—thermometer readings all around the world have risen steadily since the beginning of the Industrial Revolution.
Video is here
http://earthobservatory.nasa.gov/Features/WorldOfChange/decadaltemp.php
According to an ongoing temperature analysis conducted by scientists at NASA’s Goddard Institute for Space Studies (GISS) and shown in this series of maps, the average global temperature on Earth has increased by about 0.8°Celsius (1.4°Fahrenheit) since 1880. Two-thirds of the warming has occurred since 1975, at a rate of roughly 0.15-0.20°C per decade.
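As a quick consistency check on those two figures, using nothing but the numbers quoted above (a sketch, not GISS's actual analysis):

```python
# Rough consistency check using only the figures quoted in the article, not GISS data.
total_warming_since_1880 = 0.8                    # deg C
rate_since_1975 = (0.15, 0.20)                    # deg C per decade
decades_since_1975 = 3.5                          # 1975 to roughly 2010

low, high = (r * decades_since_1975 for r in rate_since_1975)
print(low, high)                                  # 0.525 to 0.7 deg C of warming since 1975
print(2 / 3 * total_warming_since_1880)           # ~0.53 deg C, i.e. "two-thirds of the warming"
```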
But why should we care about one degree of warming? After all, the temperature fluctuates by many degrees every day where we live.
The global temperature record represents an average over the entire surface of the planet. The temperatures we experience locally and in short periods can fluctuate significantly due to predictable cyclical events (night and day, summer and winter) and hard-to-predict wind and precipitation patterns. But the global temperature mainly depends on how much energy the planet receives from the Sun and how much it radiates back into space—quantities that change very little. The amount of energy radiated by the Earth depends significantly on the chemical composition of the atmosphere, particularly the amount of heat-trapping greenhouse gases.
A one-degree global change is significant because it takes a vast amount of heat to warm all the oceans, atmosphere, and land by that much. In the past, a one- to two-degree drop was all it took to plunge the Earth into the Little Ice Age. A five-degree drop was enough to bury a large part of North America under a towering mass of ice 20,000 years ago.
The maps above show temperature anomalies, or changes, not absolute temperature. They depict how much various regions of the world have warmed or cooled when compared with a base period of 1951-1980. (The global mean surface air temperature for that period was estimated to be 14°C (57°F), with an uncertainty of several tenths of a degree.) In other words, the maps show how much warmer or colder a region is compared to the norm for that region from 1951-1980.
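To make the idea of an anomaly concrete, here is a minimal sketch of how one observation is compared against a 1951-1980 base-period mean; the station values below are invented for illustration and are not GISS data.

```python
# Minimal illustration of a temperature anomaly: the departure of an observation from
# the mean of a fixed base period. The values below are invented, not GISS data.

def anomaly(observation_c, base_period_values_c):
    """Anomaly (deg C) of one observation relative to the base-period mean."""
    base_mean = sum(base_period_values_c) / len(base_period_values_c)
    return observation_c - base_mean

# Pretend these are July means for one station during part of 1951-1980 (illustrative only).
base_july = [21.1, 20.8, 21.3, 21.0, 20.9, 21.2]

print(anomaly(21.9, base_july))   # +0.85: warmer than that station's 1951-1980 norm
print(anomaly(20.4, base_july))   # -0.65: cooler than that station's 1951-1980 norm
```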
The data set begins in 1880 because observations did not have sufficient global coverage prior to that time. The period of 1951-1980 was chosen largely because the U.S. National Weather Service uses a three-decade period to define “normal” or average temperature. The GISS temperature analysis effort began around 1980, so the most recent 30 years was 1951-1980. It is also a period when many of today’s adults grew up, so it is a common reference that many people can remember.
To conduct its analysis, GISS uses publicly available data from 6,300 meteorological stations around the world; ship-based and satellite observations of sea surface temperature; and Antarctic research station measurements. These three data sets are loaded into a computer analysis program—available for public download from the GISS web site—that calculates trends in temperature anomalies relative to the average temperature for the same month during 1951-1980.
The objective, according to GISS scientists, is to provide an estimate of temperature change that could be compared with predictions of global climate change in response to atmospheric carbon dioxide, aerosols, and changes in solar activity.
As the maps show, global warming doesn’t mean temperatures rose everywhere at every time by one degree. Temperatures in a given year or decade might rise 5 degrees in one region and drop 2 degrees in another. Exceptionally cold winters in one region might be followed by exceptionally warm summers. Or a cold winter in one area might be balanced by an extremely warm winter in another part of the globe.
Generally, warming is greater over land than over the oceans because water is slower to absorb and release heat (thermal inertia). Warming may also differ substantially within specific land masses and ocean basins.
In the past decade (2000-2009), land temperature changes are 50 percent greater in the United States than ocean temperature changes; two to three times greater in Eurasia; and three to four times greater in the Arctic and the Antarctic Peninsula. Warming of the ocean surface has been largest over the Arctic Ocean, second largest over the Indian and Western Pacific Oceans, and third largest over most of the Atlantic Ocean.
In the analysis, the years from 1880 to 1950 tend to appear cooler (more blues than reds), growing less cool as we move toward the 1950s. Decades within the base period do not appear particularly warm or cold because they are the standard against which all decades are measured. The leveling off between the 1940s and 1970s may be explained by natural variability and possibly by cooling effects of aerosols generated by the rapid economic growth after World War II.
Fossil fuel use also increased in the post-War era (5 percent per year), boosting greenhouse gases. But aerosol cooling is more immediate, while greenhouse gases accumulate slowly and take much longer to leave the atmosphere. The strong warming trend of the past three decades likely reflects a shift from comparable aerosol and greenhouse gas effects to a predominance of greenhouse gases, as aerosols were curbed by pollution controls, according to GISS director Jim Hansen.
Thursday, 16 December 2010
The Future of GIS Mapping for Emergency Management
When I think GIS mapping, I think ESRI. While there are other companies out there, ESRI pretty much dominates the government market when it comes to computer based maps. I was able to have a conversation with one of their staff today at their booth at the Denver UASI Conference. Here are snippets of what I learned from Paul Christin:
* ESRI is moving to a "one map" look. The idea is to have the maps and the navigation tools look the same across their platforms - desktop, enterprise and mobile - to improve the user experience.
* More applications are coming all the time. He showed me a nifty free one that searches for tweets in a specific geographical area. That would be a great situational tool you could use now when there is an incident (a rough sketch of that kind of geographic filtering follows after this list).
* He sees 4G providing a little better speed, but the real advantage is multi-tasking on your smart phone/mobile device. You will be able to be on the phone talking, looking at a map, manipulating the data and sending the map to others all at the same time. Nifty!
* I asked which states are heavy into computer/GIS mapping and using the tool in their EOCs and for planning. Answer: Virginia, Florida, Georgia, Texas Public Safety.
* ArcGIS 10 will give you your map data over time. So imagine having, in effect, screen shots of what you knew and when, geographically displayed and retrievable. This will be a great documentation tool for after the event, when people question the decisions that were made. The map will give you the situation at the time and provide the justification for what you did. A wonderful tool.
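As a rough illustration of the kind of geographic filtering behind a "tweets near an incident" tool, here is a generic point-in-radius sketch; it is not ESRI's application, and the sample records are invented.

```python
# Generic point-in-radius filter for geotagged records (e.g. tweets near an incident).
# Illustrative only - not ESRI's application; the sample records are invented.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def within_radius(records, center_lat, center_lon, radius_km):
    """Keep only the records whose coordinates fall inside the search radius."""
    return [r for r in records
            if haversine_km(r["lat"], r["lon"], center_lat, center_lon) <= radius_km]

# Invented sample records standing in for geotagged tweets around Denver.
tweets = [
    {"text": "Smoke visible downtown", "lat": 39.74, "lon": -104.99},
    {"text": "All quiet up north",     "lat": 40.58, "lon": -105.08},
]
print(within_radius(tweets, 39.7392, -104.9903, 10))   # only the downtown report remains
```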
I'm convinced that just as in security the future is all about technology, so too in emergency management and homeland security we will see our major advances in the near term come from the integration of technology into what we are doing.
Remember, there is no better display in your EOC than that of an Operational Map generated by computer! I think we are finally getting to the point where we will be able to make the tool "sing and dance."
Wednesday, 15 December 2010
Geoeye purchases analytics firm
Geoeye is to purchase geospatial predictive analytics company Spadac for US$46m in cash and stock.
Spadac will become a wholly owned subsidiary of Geoeye, renamed Geoeye Analytics.
Spadac provides geospatial predictive analytic solutions to over 40 customers in key markets of defence, intelligence and homeland security. The firm will continue to provide services to these clients, though Spadac’s employees will be absorbed into Geoeye.
Brian O'Toole, Geoeye's chief technology officer, said he hopes the acquisition will help the firm expand its customer base into new markets.
"Spadac is a strong strategic fit; their technology and services will enable us to accelerate our growth in information services,” he said.
“By combining Spadac's predictive analytic solutions with our EyeQ Web platform, we'll be able to offer subscription-based access to a new class of advanced information services."
Spadac's revenues are forecast to be approximately US$27m in 2010.
Matt O'Connell, Geoeye's chief executive officer, said the deal will provide end-to-end service.
“We believe that, by combining our imagery collection capabilities with Spadac's location-based analytic solutions, we can help our customers gain unprecedented insight about the areas around the world in which they operate,” he said.
Tuesday, 14 December 2010
How cool is this? Introducing the Google Latitude app for iPhone
“Where are you?”
Starting today, you’ll never again have to answer (or ask) that question when you’re on the go with your iPhone. With the new Google Latitude app for iPhone, you can see where your friends are and now, continuously share where you are – even in the background once you’ve closed the app.
Since launching last year, Latitude’s focus has always been on one goal: make it simple to stay in touch with friends and family by sharing where you are with each other. Simple setup. Simple sharing without fumbling for your phone. Now, you can use Latitude on your iPhone just like the more than 9 million people actively using it from Android, BlackBerry, Symbian, and Windows Mobile smartphones. Use the app to:
* See where your friends are
* Share your location continuously with whomever you choose
* Contact friends by phone, text message, or email
* Control your location and privacy
You still get simple control over your privacy. Remember, Latitude is 100% opt-in. You must install the app and add friends (or accept requests) to start sharing your location. You can turn off background updating if you’d like and control the same privacy settings: share only city-level location, hide your location, or sign out of Latitude at any time. Learn more in the privacy tips video.
Though we released Latitude as a web application before the iPhone supported third party background applications, today’s Latitude app was built from the ground up using iOS 4’s new multitasking capability to support background updating. You’ll need iOS 4 and above on an iPhone 3GS or iPhone 4 to use the app.
Download Google Latitude now from the App Store in over 15 languages and 45 countries. It will be appearing in the App Store over the next day, but you can also find it directly now. Learn more in the Help Center or ask questions in the Help Forum.
Update (12/13/2010, 10:20am PST): The Google Latitude app will run on the iPhone 3GS, iPhone 4, iPad, and iPod touch (3rd/4th generation). However, background location updating is only supported on the iPhone 3GS, iPhone 4, and iPad 3G. We're continuing to work on expanding support to more devices.
Monday, 13 December 2010
Minneapolis Metrodome Collapses As Blizzard Dumps 20 Inches Of Snow On Midwest
The inflatable roof of the Minnesota Vikings' stadium collapsed Sunday and roads were closed throughout the upper Midwest as a storm that dumped nearly 2 feet of snow in some areas crawled across the region.
A blizzard warning was in effect for parts of eastern Iowa, southeastern Wisconsin, northwestern Illinois, and northern Michigan, according to the National Weather Service. Surrounding areas including Chicago were under winter storm warnings.
The Metrodome's Teflon roof collapsed after Minneapolis got more than 17 inches of snow. No injuries were reported. The snowfall that ended Saturday night was one of the five biggest in Twin Cities history, National Weather Service meteorologist James McQuirter said. Some surrounding communities got more than 21 inches of snow, he said.
Fox News has dramatic video from inside the Metrodome of the roof collapsing.
Sunday, 12 December 2010
UK's NHS Launches HealthGIS Maps
The UK’s National Health Service (NHS) AIMTC (Avon Information Management and Technology Consortium) HealthGIS has launched a new intranet mapping service to Primary Care Trust (PCT) clients in Avon, using Cadcorp GeognoSIS and Cadcorp SIS MapModeller. This new service - known as ‘HealthGIS Maps’ - has successfully delivered against its first project milestone, providing Pharmaceutical Needs Assessments for Avon PCTs.
Each PCT in Avon is currently charged with reviewing accessibility to the pharmacy services that it manages. Each Pharmaceutical Needs Assessment requires map-based evidence about access to pharmacies, viewed against demographic and socio-economic indicators. Since PCTs rarely have the necessary expertise in managing geographic information, they are being assisted by a group of GIS experts from the AIMTC HealthGIS team.
The delivery of the Pharmaceutical Needs Assessment maps via HealthGIS Maps marks a significant development in the use of geographic information in the NHS Primary Care sector, as Matthew Leaver, from ‘HealthGIS’ explains.
‘Traditionally all maps have been delivered as PDF or JPEG static media. The deployment of Cadcorp GeognoSIS in HealthGIS Maps means that we can now deliver all content to health care and medicines managers in an easy to use web browser environment. We are confident that this will promote a culture of information dissemination, and lead to efficiency savings through self-service.’
HealthGIS Maps is a service available to all NHS staff with access to the NHS network. It is anticipated that easy access to geographic information, coupled with the ability for end users to create their own bespoke maps, will provide both tangible and intangible cost savings for NHS staff who use geographic information for resource planning, service development and performance monitoring.
HealthGIS plans to achieve further cost savings for their PCT colleagues by using Cadcorp GeognoSIS to enhance legacy IT systems with new geographic capabilities.
‘The government is committed to an evidence-based public health service in the UK,’ noted Mike O’Neil, CEO of Cadcorp, ‘and much of that evidence will be geographically based. Cadcorp is proud to be providing AIMTC HealthGIS with the ability to share that geographic information more widely.’
Saturday, 11 December 2010
SuperGIS Applied to Recreation Potential Analysis Model
SuperGeo has integrated GIS technologies with a recreation potential analysis model that takes various resources and environmental protection into account. Using SuperGIS, the relevant authorities can apply the research outcome when choosing suitable spatial ranges for potential natural recreation sites.
Striking a balance between conserving the natural environment and its resources and satisfying the public's need for recreation is always a challenge. To find a solution, SuperGeo developed a project to assist the relevant authorities in choosing potential natural recreation sites.
The project aims to analyse and compare the estimated values of diverse resources through the establishment of a Recreation Potential Analysis Model for natural recreation sites. Accurate and rigorous data can then be produced as an important reference for selecting the most suitable recreation site.
The project employs SuperGIS Desktop and SuperGIS Spatial Statistical Analyst, an extension of SuperGIS Desktop, to assist the staff with the statistical estimation of natural recreational resources, landscape resources, and wildlife resources within the study area. The spatial distribution of each estimated value can be displayed on contour maps, so the staff can clearly identify the spatial range suitable for a recreation area.
The project area covers the mainstream of Taiwan's Zhong-gan river and its three tributaries. The staff selected three to five sampling sites in each river section for the recreation potential survey.
During the modelling process, the planners transform the estimated values for wildlife data, recreation potential, and landscape evaluation into grey relational coefficients, which can then be used for further analysis. Once the estimation maps are completed, the areas suitable for recreation plans become apparent.
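For readers unfamiliar with the technique, here is a minimal sketch of how grey relational coefficients are usually computed (the standard formulation with a distinguishing coefficient of 0.5); the scores below are invented for illustration and are not the project's survey data.

```python
# Standard grey relational coefficient calculation (distinguishing coefficient rho = 0.5).
# The normalized scores below are invented examples, not the project's survey data.

def grey_relational_coefficients(reference, series, rho=0.5):
    """Coefficients of one comparison series against a reference series."""
    deltas = [abs(r - s) for r, s in zip(reference, series)]
    d_min, d_max = min(deltas), max(deltas)
    return [(d_min + rho * d_max) / (d + rho * d_max) for d in deltas]

reference = [1.00, 1.00, 1.00, 1.00]   # ideal (best) normalized score for each indicator
site_a    = [0.85, 0.90, 0.70, 0.95]   # one candidate site's normalized indicator scores

print(grey_relational_coefficients(reference, site_a))
# approximately [0.67, 0.80, 0.44, 1.00] - higher means the site tracks the ideal more closely
```

In grey relational analysis the coefficients for each candidate are typically averaged into a single grade, which is what allows candidate sites to be ranked against one another.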
SuperGIS software helps the related authorities calculate the estimated values objectively and precisely. The researchers can clearly understand the spatial distribution of each estimated value by mapping. The results are also important references when the government units develop recreation sites in the future.
Thursday, 9 December 2010
Tim Berners-Lee: The year open data went worldwide
I had not seen this before; glad I have now, as it makes for good viewing. Sorry if I am regurgitating old stuff.
At TED2009, Tim Berners-Lee called for "raw data now" -- for governments, scientists and institutions to make their data openly available on the web. At TED University in 2010, he shows a few of the interesting results when the data gets linked up.
ActiveMap - Permit & Resource Management Department
ActiveMap is an Interactive Mapping Application that enables the public to view General Plan Land Use, Zoning and many other land development related data sets for property within unincorporated Sonoma County over the Internet. Within the mapping application, there are three maps to choose from: Base, Zoning and General Plan. Each map provides different types of geographic data used in land use planning and land development. Users will be able to research many PRMD data sets for properties within unincorporated Sonoma County in a flexible, easy-to-use map product.
- Use of the Interactive Mapping Application is subject to the Terms & Conditions of Use.
- Upon selecting and opening an Interactive Map below, users may toggle between maps.
Base Map: This map provides a geographic and visual context to general information for the unincorporated areas of Sonoma County. The layers of information displayed represent PRMD’s base map data. The map’s layers of information include: Air Quality Control Board, Area and Specific Plans, City Limits, City Sphere of Influence, City Urban Growth, Fire Protection Responsibility Area, Flood Prone Urban Area, Geographic Places, Lake Sonoma, Local Area Development Guidelines, Parcels, Photos - Color Aerial, Planning Areas, Redevelopment Plan, Supervisor Districts, Urban Service Areas, USGS Streams, Waiver Prohibition Areas, Water Quality Control Board, Wet Weather Zones, and Williamson Act Land Contracts.
Zoning: This map combines PRMD base map data with map layers that provide specific zoning information related to planning and development within the unincorporated areas of Sonoma County. The map’s layers of information include: Affordable Housing, Biotic Resource, City Limits, Floodplain, Floodway, Geographic Places, Geologic Hazard, Historic District, Land Use Policy (General Plan), Lake Sonoma, Mineral Resource, Parcels, Photos - Color Aerial, Scenic Design, Scenic Resource, Urban Service Areas, Valley Oak Habitat, and Zoning by Area.
General Plan: This map combines PRMD base map data with map layers that provide specific General Plan Land Use (GP LU) and Open Space (GP OS) information related to planning and development within the unincorporated areas of Sonoma County. The map’s layers of information include: City Limits, Geographic Places, GP LU by Area, GP Planning Area Policies, GP OS Community Separators, GP OS Existing Parks, GP OS Future Parks, GP OS Habitat Connectivity Corridors, GP OS Marshes and Wetlands, GP OS Riparian Corridors (USGS Streams), GP OS Scenic Corridors, GP OS Scenic Landscape Units, Lake Sonoma, Parcels, Photos - Color Aerial, Planning Areas, and Urban Service Areas.
System Recommendations
- Screen Resolution: To view the graphic details of the maps, it is recommended to use a monitor with a screen resolution of 1024 x 768 or higher.
- Pop-ups: This site creates pop-up windows. Please enable pop-ups by turning off Pop-up Blocker.
- Adobe Acrobat PDF Files: Many of the documents on the websites are in Adobe Acrobat Portable Document Format (PDF). PDF format is used to preserve the content and layout of our hard copy publications. Publications in PDF can only be viewed and printed using the Adobe Acrobat Reader version 4.0 or higher. The Adobe Acrobat Reader can be downloaded for free from the Adobe Systems, Inc. site, where help using the Reader is also provided.
Accessibility Statement
- The Sonoma County PRMD is committed to ensuring the interactive mapping applications are accessible to all users. The website undergoes review and redesign as necessary to ensure that they meet and/or exceed the requirements of Section 508 of the Rehabilitation Act of 1973.
- Should gaining access to information on these websites be hindered, please e-mail ActiveMap’s webmaster and assistance shall be provided to meet your needs.
Contact Us
- The PRMD geographic information systems (GIS) staff serves the needs of County staff, Local, State and Federal government and the citizens and businesses of the County of Sonoma. Utilizing GIS technology, information regarding geographical elements (e.g. land, river, streets) and their relationships to one another can be visually displayed (for review) and analyzed.
- The GIS section is the nucleus of PRMD’s network, managing and administrating land use datasets, software and technology. The ultimate goal of PRMD’s GIS staff is to implement innovative ideas to develop, maintain and provide accurate geographical information in an effort to facilitate land development planning and determinations, while fostering efficient business practices and better decision making.
- Additionally, our GIS products and services aid PRMD’s staff decision and policymaking processes, and seek to improve the livelihood of the citizens of Sonoma County.
- Should you have questions, inquiries and/or comments, please e-mail ActiveMap’s webmaster.
Wednesday, 8 December 2010
Hackers Defend WikiLeaks by Attacking PayPal and PostFinance
A group of hackers connected to the online imageboard 4chan, often referred to as Anonymous, has retaliated against several sites that denied service to WikiLeaks shortly after the site started releasing secret embassy cables. The site of Swiss bank PostFinance, which closed the account of WikiLeaks founder Julian Assange, has been taken down and is still unavailable at the time of this writing. Hackers have also attacked PayPal but have only managed to take down the site’s blog, while the service remained operational.
A spokesman for the group behind the attacks on PayPal and PostFinance said they will target any website that’s “bowing down to government pressure.” The same group is allegedly behind the series of attacks collectively called “Operation: Payback,” which targeted anti-piracy organizations such as RIAA and MPAA.
Among other companies that have denied service to WikiLeaks are DNS service provider EveryDNS.net and Amazon. Most of these sites claim they haven’t shut down WikiLeaks’ account due to political pressure, instead citing technical or procedural reasons for denying the service.
Update: Mastercard, which has also denied service to WikiLeaks, has had its site taken down by hackers, too.
How the World Uses Cellphones
People in the United Arab Emirates (UAE) own more cellphones per capita than Americans do, according to this infographic designed by Wilson Electronics.
The infographic, titled The Shocking Demographics of Cellphone Use, dispenses a few lesser-known facts about those gadgets we use on a daily basis. For example, while the average person in the UAE owns 1.95 phones, the average American only owns 0.87 phones.
Some other strange facts: 15% of Americans answer phone calls during sex, and 10% of those under 25 think that it’s acceptable to text while having sex.
In total, Americans spend approximately 6.1 billion minutes per day talking on their cellphones, which works out to about 21 minutes per person. Texting is also on the rise: roughly 4.1 billion texts were sent per day in the US in 2009, compared with 5 billion per day this year.
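A quick sanity check of the per-person figure; the population estimate here is my assumption, not from the infographic.

```python
# Sanity check of the per-person talk time; the population figure is assumed.
total_minutes_per_day = 6.1e9      # about 6.1 billion minutes of calls per day
us_population = 290e6              # rough 2010 U.S. population (assumption)
print(round(total_minutes_per_day / us_population))  # ~21 minutes per person per day
```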
Amazing Security Video of Cruise Ship in a Storm
Go to www.qwackers.com and click on "Cruise Ship Horror" on the right-hand side.
Map of Wikileaks list of facilities 'vital to US security'
The first step was to take the rather messy data and identify individual entries. In some cases it was no more specific than "Indonesia: Tin Mine and Plant". In other cases it named a pipeline, a port, or a city in which an undersea cable made landfall. Next, using the worldatlas.com geocoder (as well as some Wikipedia entries), we (along with the much appreciated help of Zach U. and Tim B.) located an approximate latitude and longitude for each of the locations mentioned in the cables.
We wish to emphasize that the locations in our mashup are only for the cities in which these critical facilities are located, and not the actual facilities themselves. In some cases, the location in the map is no more detailed than the country. Given this relative inaccuracy, this map does not present any security threat whatsoever. Moreover, all the data sets used for this geo-coding are openly available on the Internet and could easily be replicated by anyone.
Our purpose is to visualize the patterns exhibited by this particular data set, which are illustrated below. The categories in the legend are our own classifications based on the information provided by Wikileaks (you can view a larger, non-embedded version of our mashup here, or download a KMZ file of the mashup here that should automatically load into Google Earth. The KMZ version also allows you to turn on and off categories as you wish).
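For readers who want to reproduce the geocoding step, here is a minimal sketch using the geopy library's Nominatim client as a stand-in for the worldatlas.com geocoder actually used; the example entries and the fall-back-to-country logic are illustrative assumptions.

```python
# Rough sketch of the geocoding step. geopy/Nominatim stands in for the
# worldatlas.com geocoder used in the original mashup; entries are illustrative.
from geopy.geocoders import Nominatim

geolocator = Nominatim(user_agent="facility-list-mashup")

entries = [
    "Indonesia: Tin Mine and Plant",   # only country-level detail available
    "Southampton, United Kingdom",     # hypothetical city-level entry
]

for entry in entries:
    # When the entry reads "Country: description", fall back to the country name.
    place = entry.split(":")[0].strip()
    location = geolocator.geocode(place, timeout=10)
    if location:
        print(f"{entry} -> {location.latitude:.2f}, {location.longitude:.2f}")
    else:
        print(f"{entry} -> not found")
```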
Map Legend
It is interesting to note that the vast majority of these facilities are not directly military-related. Even the ones that we mark as 'military' are related industrial facilities rather than actual bases. Instead, the list seems to focus on non-military topics such as telecommunications, energy and pharmaceuticals. Much of the list is also focused on supplies of important raw materials (Bauxite, Chromite, and Rare Earth Minerals), as well as the ability to move products through ports and shipping channels.
Share of Facilities by Type
Telecommunications: 28%
Energy: 18%
Pharmaceuticals: 13%
Border Crossing: 11%
Raw Material: 10%
Port: 7%
Military: 5%
Industrial: 4%
Shipping: 3%
Dam: 2%
These data offer a fascinating insight into the ways that the national security priorities of the United States span the entire globe. This global web of essential facilities goes a long way toward explaining why the US Department of Defense has more military facilities around the world than all other nations combined. The globalization of the world economy means that facilities vital to the communication, health, and economic needs of the U.S. are scattered across the planet; and this ultimately means that the U.S. (as well as other developed and developing countries) has to contend with new and changing notions of what "security" means in the 21st century.
We are truly living in a network society.
Tuesday, 7 December 2010
Geospatial Information Systems Market to Experience Vibrant Growth
The worldwide market for Geospatial Information Systems (GIS) is forecast to grow 65 percent over the next five years, representing a compound annual growth rate of 10.5 percent, according to a new ARC Advisory Group study. With the global economic downturn now predominantly in the past, capital spending on information technology has rebounded vigorously. The GIS market has taken part in this technology rebound and is expected to experience vibrant growth.
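As a quick check, the two growth figures quoted are consistent with each other:

```python
# 10.5% compound annual growth sustained for five years gives roughly 65% cumulative growth.
print(round((1.105 ** 5 - 1) * 100, 1))   # -> 64.7
```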
“Traditionally strong GIS market segments such as electric power, oil and gas distribution, and divisions of federal governments continue to expand their use of GIS solutions. Meanwhile, more contemporary segments such as insurance, real estate, and retail are expected to increase GIS adoption along with the global economic rebound,” according to Clint Reiser, Enterprise Software Analyst, and the principal author of ARC’s “Geospatial Information Systems Worldwide Outlook”.
Government Policy Drives GIS Sales
A large percentage of total GIS sales is to the government and utility sectors. This characteristic makes the GIS market sensitive to changes in government policy. The Smart Grid initiative in the US, the European 20/20/20 initiative, and India’s power development program are currently stimulating growth in GIS sales to the electric power industry. The US Pipeline Integrity, Protection, Enforcement, and Safety Act is expected to drive GIS sales to the oil & gas distribution industry as suppliers employ GIS to support risk-based integrity management programs and improvements in asset reporting capabilities. At the same time, government spending at the federal level on infrastructure and technology has been initiated in many regions as a means of stabilizing the economy, supporting future growth, and maintaining current programs. Some of this spending is expected to generate GIS sales to government agencies.
Modern IT Reshapes the GIS Market
Mobile solutions, web services, and cloud computing have become influential factors within the GIS industry. Mobile GIS has become more prevalent and shows promise for increased adoption in network design and maintenance functions. The software-as-a-service business model is now well established as the delivery mechanism for geospatial business analysis solutions. Meanwhile, the improvements in connectivity provided by cloud computing have also brought about an influx of GIS products that integrate imaging and GIS data with time-based information such as traffic reports to create mash-ups. Improvements in connectivity are also providing increased visibility into GIS data sets stored within organizations, thereby decreasing duplicate purchases and unnecessary costs. The continuing adoption of mobile GIS, the SaaS delivery model, and other forms of cloud computing has great potential to transform and expand segments of the GIS market in the near future.
Emerging Markets Primed for Growth
Emerging markets such as China, India, and Brazil continue to invest in infrastructure for electric power distribution, water & wastewater, and telecommunications. GIS sales to these regions will increase as the local utilities adopt GIS to improve their infrastructure management processes.
Monday, 6 December 2010
Loqate Launches Worldwide Geocoding Solution
Loqate, the specialist in global location data, announces the launch of a new worldwide geocoding software solution. Covering over 240 countries, Loqate provides rooftop, street, city, or postal code level geocoding for almost any address or location around the world.
“This new solution is revolutionary in two ways,” comments President/CEO of Loqate Inc, Martin Turvey. “Our advanced parsing and address verification engine enables us to obtain geocode matches on even the very worst data, including non-Roman alphabet based languages. Secondly, the worldwide coverage at the level of granularity provided by the solution, all through a single interface, is completely unique.”
“Geocoding is playing an increasingly important part in many businesses,” adds Loqate’s VP of Sales, Liat Perlman. “From traditional data quality markets, through freight and transportation, GIS and mapping, insurance risk assessment and, more recently, the massive growth of Location Based Services, being able to pinpoint an accurate location of an address is core to so many business processes.”
The Loqate Geo-data Engine is available as a developer’s toolkit for integration into a wide variety of applications and business processes and is designed to run on most operating system platforms. Loqate’s solution will soon be available in a Software as a Service model.
About Loqate
Loqate provides international addressing and geo-location software solutions, enabling clients to make better use of their data in areas such as data quality, direct marketing, fraud detection, insurance risk assessment, transaction monitoring, logistics & supply chain, online payment processing, as well as GIS, mapping and location based services. Using a combination of advanced algorithmic analysis and comprehensive reference sources, Loqate can identify, verify, correct and enrich geographic-related client data, adding valuable information to increase its value. Loqate solutions are global, covering over 240 countries around the world - every populated world territory. Loqate is based in Redwood City, CA, and operates through a number of strategic partnerships throughout the world. www.loqate.com
Saturday, 4 December 2010
Earth from Space: Eye of Africa
A stunning Envisat image capturing spectacular geological phenomena – the ‘Eye of Africa’ (right) and a magnetic mountain (centre) – in the Sahara Desert of Mauritania in northern Africa.
Peeping out from a sea of golden sand, the remarkable circular Richat structure resembles an eye from space. Once thought to be formed by a meteor impact, it is now believed to be the result of geological uplift exposed by wind and water erosion.
Different rates of erosion on the varying rock types have formed concentric ridges; the more erosion-resistant rocks form high ridges (blue and purple), while the non-resistant rocks form valleys (yellow).
The surrounding dark area forms a plateau of sedimentary rock standing some 200 m above the surrounding desert sands, with the peak of the outer rim some 485 m above sea level. Sand is encroaching into the Richat structure’s southern side.
Mauritania’s highest peak (nearly 1000 m), the Kediet ej Jill Mountain, is visible northwest of Richat. It appears bluish because it is made completely of magnetite, a natural magnet. Owing to its inherent magnetic properties, the mountain disrupts navigational compasses.
Western Sahara is visible along the top and left.
This image was acquired by Envisat’s Medium Resolution Imaging Spectrometer on 1 November at a resolution of 300 m.
A close-up of the Richat structure in the Sahara Desert of Mauritania in northern Africa. Credits: ESA
Friday, 3 December 2010
Google Launches Earth Engine for Carbon Accounting
Click here for a nice video:
http://www.youtube.com/watch?v=MnCf9Gjz720
As promised last year at the COP15 climate talks in Copenhagen, the Google Earth Engine launched today at the COP16 Climate Change Conference in Cancun. Earth Engine is a dynamic digital model of our planet that is updated daily, with 25 years of satellite imagery from the Landsat imagery catalog and tools to analyze the imagery. The tools were made available to rainforest nations in advance of this week’s climate talks as both a means of assessing carbon in their forests and as a testbed for system development.
Earth Engine is not only a core platform for hosting imagery and other geospatial data, it is also an analysis processing framework with an Earth Engine API for scientists to write their own code to customize the platform for their area of interest. Google is also providing 20 million CPU hours for free to scientists and developing countries to take advantage of the new analytical tool.
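For a sense of what writing code against the Earth Engine API looks like, here is a minimal sketch using the present-day Earth Engine Python client; the Landsat 5 collection ID, band names, region, and date ranges are illustrative assumptions rather than details from the announcement.

```python
# Minimal sketch against the Earth Engine Python client (earthengine-api).
# Assumes an authenticated account; dataset ID, bands, region and dates are illustrative.
import ee

ee.Initialize()

region = ee.Geometry.Rectangle([-62.0, -10.0, -61.0, -9.0])  # hypothetical forest tile

def median_ndvi(start, end):
    """Median NDVI over the region for a date range (Landsat 5 surface reflectance)."""
    col = (ee.ImageCollection("LANDSAT/LT05/C02/T1_L2")
           .filterBounds(region)
           .filterDate(start, end))
    # NDVI from the red (SR_B3) and near-infrared (SR_B4) bands.
    return col.median().normalizedDifference(["SR_B4", "SR_B3"])

# Change in vegetation greenness between two periods, averaged over the region.
change = median_ndvi("2000-01-01", "2000-12-31").subtract(
    median_ndvi("1990-01-01", "1990-12-31"))
stats = change.reduceRegion(reducer=ee.Reducer.mean(), geometry=region, scale=30)
print(stats.getInfo())
```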
Earth Engine has been a collaborative effort that has included contributions from Greg Asner of the Carnegie Institution for Science, Carlos Souza of Imazon and Matt Hansen of the Geographic Information Science Center at South Dakota State University.
It is hoped that this year’s talks will result in an agreement on the U.N. initiative known as Reducing Emissions from Deforestation and Forest Degradation in Developing Countries (REDD), and this tool will act as a means to monitor and enforce such an agreement, tracking environmental change and deforestation rates in real time.
Thursday, 2 December 2010
GIS on a Stick
GIS on a USB stick. If you have any comments after downloading, please post them in the comments area; it would be great to have some decent impartial reviews.
Announcement: Portable GIS Version 2 is released! Download it here
The philosophy behind this idea was to provide beginners with a ready-installed and configured stack of open source GIS tools that would run in Windows without the need for emulation or a live CD. By taking out the often difficult installation and configuration, I hope to make it easier for beginners to get started with open source GIS, so they are not put off before it gets interesting and fun. Not only that, but having a fully self-contained GIS system may prove useful in a number of real-life situations.
Newly updated version 2 contains a self-contained installer, updated versions of all the constituent software packages, a new control panel, and improved documentation.
The current set of software includes:
- Desktop GIS packages QGIS (with GRASS plugin), uDIG and gvSIG,
- FWTools (GDAL and OGR toolkit)
- XAMPPlite (Apache2/MySQL5/Php5),
- PostgreSQL (version 8.4)/Postgis (version 1.4),
- Mapserver, OpenLayers, Tilecache, Featureserver, and Geoserver web applications.
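As a small illustration of what the bundled stack lets a beginner do out of the box, here is a sketch using the GDAL/OGR Python bindings shipped with FWTools to inspect a vector dataset; the shapefile name is hypothetical.

```python
# Inspect a shapefile with the GDAL/OGR Python bindings (bundled via FWTools).
# The file name is hypothetical.
from osgeo import ogr

ds = ogr.Open("parcels.shp")
layer = ds.GetLayer(0)
defn = layer.GetLayerDefn()

print("Feature count:", layer.GetFeatureCount())
print("Fields:", [defn.GetFieldDefn(i).GetName() for i in range(defn.GetFieldCount())])

# Print the centroid of the first feature as well-known text.
feature = layer.GetNextFeature()
geom = feature.GetGeometryRef()
print(feature.GetFID(), geom.GetGeometryName(), geom.Centroid().ExportToWkt())
```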
Provisos:
- This is not for production use. In order to keep end user configuration to a minimum there are a number of security holes and as such it should be used for demonstration and home use only.
- It is also not “stealth GIS”- no attempt has been made to leave no trace on the host system.
- Launchpad site (with bug tracker and mailing list)
- Portable GIS Google Group
Wednesday, 1 December 2010
Awesome New Tool, Needlebase.
Needlebase allows you to view web pages through a virtual browser and point and click to train it in understanding which fields on a page are of interest to you and how those fields relate to each other. The program then scrapes the data from all of those fields, publishes it into a table, list or map, and recommends merges of cells that appear to be mistakenly separate. It's very cool, and it lets non-technical people do things with data quickly and easily that used to require the assistance of someone more technical.
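For a sense of what Needlebase automates, here is the hand-rolled equivalent in Python; the URL, CSS selectors and field names are hypothetical, and Needlebase itself requires no code at all.

```python
# Hand-rolled equivalent of a Needlebase-style scrape: fetch a page, pull out the
# fields of interest, and save them as a table. URL and selectors are hypothetical.
import csv
import requests
from bs4 import BeautifulSoup

html = requests.get("https://example.com/listings", timeout=30).text
soup = BeautifulSoup(html, "html.parser")

rows = []
for item in soup.select("div.listing"):        # one record per listing block
    name = item.select_one("h2")
    location = item.select_one("span.location")
    rows.append({
        "name": name.get_text(strip=True) if name else "",
        "location": location.get_text(strip=True) if location else "",
    })

with open("listings.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "location"])
    writer.writeheader()
    writer.writerows(rows)
```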
Video is here:
http://www.youtube.com/watch?v=58Gzlq4zSDk&feature=player_embedded
Investigative journalism
Last month a local newspaper reported that a big new data center had opened in Salt Lake City with a mystery anchor client. The paper believed the client was Twitter, as the company has said it was going to open its first off-site data center in Utah at an undisclosed date.
We used Needlebase to look at all the tweets from people on the Twitter list of Twitter staff members and extract the username, message body and location, if exposed. Needlebase scraped the last 1500 Tweets in less than 5 minutes. We displayed them on a map and saw that there was just one Tweet published in that time from Utah: a Twitter Site Operations Technician who had just left San Francisco to move to Salt Lake City, complaining about Qwest router problems. That wasn't quite confirmation, but it sure felt like a valuable clue and was very easy to come by thanks to Needlebase.
Data Re-Sorting
Last night I found a solution to a long-running issue I've been struggling with. I've got this list of 300 blogs around the web that cover geotechnology (that's a whole other story) and have them all run through Postrank. That service ranks them in order of most to least social media and reader engagement per blog post.
Wouldn't it be great to extract that data over time, to track it and to turn it into blog posts? I think it would. I couldn't figure out how to get all the data out that I wanted though.
Enter Needlebase. Last night I pointed Needle to my Postrank pages for geotech blogs and in minutes it pulled down all the data I wanted. I exported that data as a CSV, uploaded it to Google Docs as a spreadsheet, did a little subtraction and now have the following chart tracking the top 300 geotech blogs on the web. Now in my handy spreadsheet, I was able to set up a function to show me which blogs jumped or fell in the rankings the most over the previous week. Thanks, Needlebase!
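That "little subtraction" boils down to comparing two weekly exports; here is a minimal sketch, with the file and column names as assumptions.

```python
# Compare two weekly Postrank exports to find the biggest movers.
# File names and column names (blog, rank) are assumptions.
import pandas as pd

last_week = pd.read_csv("postrank_last_week.csv")   # columns: blog, rank
this_week = pd.read_csv("postrank_this_week.csv")

merged = this_week.merge(last_week, on="blog", suffixes=("_now", "_prev"))
merged["change"] = merged["rank_prev"] - merged["rank_now"]   # positive = climbed

print(merged.sort_values("change", ascending=False).head(10))
```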
Event Preparation
I've written here about how to use Mechanical Turk to get ready and rock an industry event. Needlebase can prove useful for that as well.
The DIY Data Hackers Toolkit
I put Needle in my mind in between two other wonderful tools. On one end of the spectrum is the now Yahoo-acquired Dapper, which anyone can use to build an RSS feed from changes made to any field on any web page. (See: The Glory and Bliss of Screen Scraping and How Yahoo's Latest Acquisition Stole and Broke My Heart)
On the other end of the spectrum is the brand-new Extractiv, a bulk web-crawling and semantic analysis tool that's also remarkably easy to use. Earlier this month I used Extractiv to search across 300 top geotech blogs for all instances of the word "ESRI," all entities mentioned in relation to ESRI and the words used to describe those relations. The service processed 125,000 pages and spit out my results in less than an hour for less than a dollar. That's incredible - it's a game changer.
Needlebase is too. It sits somewhere in between Dapper and Extractiv, I think. These tools are democratizing the ability to extract and work with data from across the web. They are to text processing what blogging was to text publishing.
Tuesday, 30 November 2010
Has NASA discovered Alien Life????
Has NASA discovered extraterrestrial life?
Here's a curious press release from NASA:
"NASA will hold a news conference at 2 p.m. EST on Thursday, Dec. 2, to discuss an astrobiology finding that will impact the search for evidence of extraterrestrial life. Astrobiology is the study of the origin, evolution, distribution and future of life in the universe."
I did a little research on the news conference participants and found:
1. Pamela Conrad (a geobiologist) was the primary author of a 2009 paper on geology and life on Mars
2. Felisa Wolfe-Simon (an oceanographer) has written extensively on photosynthesis using arsenic recently (she worked on the team mentioned in this article)
3. Steven Benner (a biologist) is on the "Titan Team" at the Jet Propulsion Laboratory; they're looking at Titan (Saturn's largest moon) as an early-Earth-like chemical environment. This is likely related to the Cassini mission.
4. James Elser (an ecologist) is involved with a NASA-funded astrobiology program called Follow the Elements, which emphasizes looking at the chemistry of environments where life evolves (and not just looking at water or carbon or oxygen).
So, if I had to guess at what NASA is going to reveal on Thursday, I'd say that they've discovered arsenic on Titan and maybe even detected chemical evidence of bacteria utilizing it for photosynthesis (by following the elements). Or something like that. (thx, sippey)
By Jason Kottke
Google Earth 6 Improves Street View, Historical Imagery, and Adds (Millions of) 3D Trees
Google Earth has always had an incredible "wow" factor. But the newly-released Google Earth 6, in Google's own words, takes "realism in the virtual globe to the next level." This version adds two new features, an integrated Street View experience and 3D trees, and also makes it easier to browse historical imagery associated with a specific location.
Google Earth provides a wealth of computer-generated building models, but Google notes that trees have been "rather hard to come by." In the service of boosting the realism of the 3D world substantially, today's Google Earth release includes models for dozens of species of trees. Google says it's already "planted" over 80 million trees in Google Earth.
Street View isn't a new feature for Google Earth. It's been available since 2008. But the experience is now fully integrated, so you can zoom from the outer space view of Earth smoothly and seamlessly to your doorstep. Simply drag Pegman, the Street View mascot, onto any place where you see a blue highlighted road, an indication that Street View is available. And from there you can use the navigation controls to move around.
Like Street View, the availability of historical imagery via Google Earth isn't entirely new. But this release makes these images far easier to find. When you fly to an area where images are available, the date of the oldest imagery will appear in the status bar. Clicking on it will transport you to that view, and you can browse through other images for that location.
Click link for demo
http://www.readwriteweb.com/archives/google_earth_6_improves_street_view_historical_ima.php
Monday, 29 November 2010
Saturday, 27 November 2010
Friday, 26 November 2010
The Father of GIS
Roger Tomlinson changed the face of geography as a discipline when he introduced geographic information system (GIS) technology in the late 1960s. GIS scans maps into a computer and allows the data built into those maps to be analyzed along with related statistical information about the region.
This meant mapping the whole environment – not just physical landmarks, but population patterns, animal migration routes and land suitable for tourism, compiled together for ease of reference.
It began during a chance encounter with Lee Pratt, then head of Canada Land Inventory, on a flight from Ottawa to Toronto in 1961. Mr. Pratt was tasked with developing a comprehensive map of one million square miles of Canadian land, and took to Mr. Tomlinson’s ideas during the chance meeting. Mr. Tomlinson’s work on that project evolved into the Canada Geographic Information System, and Canada became the first country in the world to have a computerized GIS.
Mr. Tomlinson’s mapping system is the enabler of our modern computer mapping and global positioning systems. It laid the foundation for Google Maps and GPS receivers in cars.
The Ottawa-born geographic information systems are now used in over 400,000 institutions in more than 135 countries; far from being just an academic success, millions of people are now involved in the $51-billion per year industry.
Mr. Tomlinson is the recipient of The Geospatial Information & Technology Association’s 2010 Lifetime Achievement Award, the highest honour the association can bestow, which recognizes an individual’s lifelong contributions and long-standing commitment to the geospatial industry. He is the principal of Tomlinson Associates, Ltd., Consulting Geographers; clients have included the World Bank, and the United Nations Food and Agriculture Organization.
Thursday, 25 November 2010
Open Government Data - Stakeholders Survey go to: http://survey.lod2.eu
Blogged FYI... I guess it makes sense to retweet this if you're interested.
Dear Sir/Madam, http://survey.lod2.eu
We are inviting you to participate in a survey about open government data. This is part of the European funded LOD2 project, a significant part of which is focused on creating free and open source tools and services to make it easier to find and reuse open government data.
If you are interested in government information (whether as a publisher, producer, re-user or consumer) we would be very grateful for 10-15 minutes of your time to express your views about what you would like to see from the technology the LOD2 project is developing.
The survey will be open until the 17th December 2010 and can be accessed here: http://survey.lod2.eu
If you have any questions or issues about the survey please don't hesitate to contact the responsible person, Martin Kaltenböck by e-mail: m.kaltenboeck@semantic-web.at
Thank you very much.
Yours sincerely,
Martina Eydner
European Commission
Directorate-General Information Society and Media
Directorat E - Unit E2 - Technologies for Information Management
(EUFO 1-294)
Jean Monnet Building
Rue Alcide de Gasperi
L-2920 Luxembourg
tel.: (+352) 4301 32615
fax: (+352) 4301 38099
DISCLAIMER
"The views expressed are purely those of the writer and may not in any circumstances be regarded as stating an official position of the European Commission."
"The views expressed are purely those of the writer and may not in any circumstances be regarded as stating an official position of the European Commission."