Part 2 – Big Data: Serving Man for the Global Good

In our last post, we promised to tell you exactly how big data is being used to serve man by making life better.

The truth is, big data really can open up all kinds of possibilities for doing public good.

However, much of the hoopla surrounding the potential of big data has focused on marketing and other business applications, such as combining transactional data with information gleaned from social networks to better identify the products consumers are most likely to buy.

But the gleam of big data has also caught the eye of humanitarian agencies seeking better ways to accomplish their missions during crisis situations and to support economic development in emerging markets.

For example, the Financial Times notes that because cell phone penetration in emerging markets is very high, data mined from cell phone use can help humanitarian agencies identify brewing disasters and respond better when tragedy occurs.

When a massive earthquake struck Haiti more than two years ago, people scattered, leaving aid agencies struggling to identify where to send help. But researchers at Columbia University and the Karolinska Institute tracked the SIM cards inside mobile phones owned by Haitians to analyze the destinations of more than 600,000 displaced people. The researchers also used the same strategy to route medicine to the correct Haitian locations to try to stem the spread of cholera.

Aid groups are also beginning to analyze levels of mobile phone usage and patterns of bill payment, because changes in these can signal rising economic distress.

Big data can yield critical information about the socioeconomic status of populations, according to a report by UN Global Pulse, the UN's innovation lab.

The report notes, for instance, that social media chatter has been an early indicator of unemployment in some countries, and that food commodity prices mined from websites closely match the Consumer Price Index in six Latin American countries.

In addition, analyzing tweets that mention the price of rice can serve as an accurate indicator of food price inflation. References to food shortages or ethnic strife could predict impending famine or civil unrest.
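To make the idea concrete, here is a minimal sketch of how such a tweet-based price signal could work: pull numbers mentioned near the word "rice" out of tweet text and average them into a crude daily price reading. The function name, regex, and sample tweets are illustrative assumptions, not part of any system described in the post.

```python
import re
from statistics import mean

# Hypothetical pattern: a number appearing after the word "rice" in a tweet.
# Real systems would handle currencies, units, and local languages.
PRICE_RE = re.compile(r"rice.*?(\d+(?:\.\d+)?)", re.IGNORECASE)

def daily_rice_price_signal(tweets):
    """Average all rice prices mentioned in a batch of tweets.

    Returns None if no tweet mentions a price.
    """
    prices = []
    for text in tweets:
        match = PRICE_RE.search(text)
        if match:
            prices.append(float(match.group(1)))
    return mean(prices) if prices else None

tweets = [
    "rice now 32 per kilo at the market",
    "the price of rice hit 35 today?!",
    "nothing about food here",
]
print(daily_rice_price_signal(tweets))  # prints 33.5
```

Tracked day over day, a signal like this could be compared against official inflation figures, which is the kind of correlation the Global Pulse report describes.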

Robert Kirkpatrick, who runs the Global Pulse, told the Financial Times that big data should be used for the global good. And Kirkpatrick dreams of using the data to create the social media equivalent of “‘meteorological stations’ that can test the winds of public debate, spot economic trends and predict looming problems in a beneficial way.”

Recently, Global Pulse and UNICEF collaborated on a workshop to discuss additional potential applications of big data and analytics for the greater good, including:

  • Using microfinance data as a barometer of a region’s health by detailing the economic decision making of poor and marginalized populations.
  • Determining specific human behaviors from cell phone data to evaluate the effectiveness of policy decisions. For example, one researcher analyzed call records to track people’s mobility in Mexico to measure the effectiveness of the government’s warning to people to stay away from public spaces during a flu outbreak in 2009.
  • Using social media to understand youth sentiment about HIV and to target communication for better results.
  • Using data, mobile surveys and behavioral information to develop more accurate profiles of communities – and using tools to speed polio eradication and confirm vaccination coverage.

So, as you can see, big data is doing its part to serve man by helping to make the world a better, safer and healthier place for us all.

Next Steps:

  • Subscribe to our blog to stay up to date on the latest insights and trends in big data and analytics.
  • Join us on August 23 at 1 p.m. EDT for our complimentary webcast, “In-Memory Computing: Lifting the Burden of Big Data,” presented by Nathaniel Rowe, Research Analyst, Aberdeen Group, and Michael O’Connell, PhD, Sr. Director, Analytics, TIBCO Spotfire. Rowe will discuss findings from Aberdeen Group’s December 2011 study on the current state of big data, which shows that organizations that have adopted in-memory computing are able to analyze larger amounts of data far faster than their competitors. O’Connell will follow with a discussion of Spotfire’s big data analytics capabilities.
  • Download a copy of the Aberdeen In-Memory Big Data whitepaper here.

