Apps make sense of social media ‘noise’


Social media is the virtual playground where people create, share and communicate instantaneously, and evolving digital technologies are making sense of all the updates, tweets, check-ins, photos and video uploads, according to a Cornell NYC Tech professor.

“Social media has a lot of content, a lot of noise, but for anything important it’s hard to access,” said Mor Naaman, associate professor of information science in the Jacobs Technion-Cornell Innovation Institute at Cornell Tech, where he directs its Social Media Information Lab. “Social media tells us what’s going on, and we can model it in a way that’s easily accessible and usable,” he added.

Naaman met with journalists Dec. 11 at the ILR Conference Center in Manhattan to showcase two Cornell projects that transform New Yorkers' social media activity into usable data.

The first, CityBeat, is an interactive application and news ticker being beta-tested by journalists at The New York Times and The New York World. The app gathers geo-tagged content from sites like Twitter and Instagram, along with venue check-ins from Foursquare, and extracts useful, usable information from it.

CityBeat features trending New York City happenings, but it can also serve as the equivalent of a police scanner for newsrooms. The app is meant to detect events such as a plane landing in the Hudson or a pop-up protest in real time. It can aid journalists by featuring key information and suggesting eyewitnesses to events.
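The article doesn't describe CityBeat's internals, but the general idea behind this kind of real-time event detection can be sketched simply: bucket geo-tagged posts into coarse location cells, and flag a cell when its recent activity far exceeds its usual baseline. The grid size, time window and spike threshold below are illustrative assumptions, not CityBeat's actual parameters.

```python
from collections import defaultdict, deque
from datetime import timedelta

# Minimal sketch of spike-based event detection over geo-tagged posts.
# A "cell" is a coarse lat/lon grid square; a candidate event is flagged when
# the post count in a cell over the last window far exceeds its recent baseline.

WINDOW = timedelta(minutes=15)
GRID = 0.01            # roughly 1 km grid cells (illustrative)
SPIKE_FACTOR = 5       # current window must be 5x the baseline (illustrative)

def cell(lat, lon):
    """Map coordinates to a coarse grid cell."""
    return (round(lat / GRID), round(lon / GRID))

class EventDetector:
    def __init__(self):
        self.recent = defaultdict(deque)    # cell -> post timestamps in the current window
        self.baseline = defaultdict(float)  # cell -> smoothed historical post rate

    def add_post(self, ts, lat, lon):
        c = cell(lat, lon)
        q = self.recent[c]
        q.append(ts)
        # Drop posts that have fallen out of the window.
        while q and ts - q[0] > WINDOW:
            q.popleft()
        count = len(q)
        base = self.baseline[c]
        # Exponentially smooth the baseline so each neighborhood adapts slowly.
        self.baseline[c] = 0.95 * base + 0.05 * count
        if base > 0 and count > SPIKE_FACTOR * base:
            return c  # unusual burst of activity: candidate event in this cell
        return None
```

A newsroom tool could then surface the posts and authors in a flagged cell as the key information and potential eyewitnesses the article mentions.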

ParkBeat, an application in its early design phase, will help New York's Department of Parks and Recreation send resources effectively across its 29,000 acres of land. Currently, the parks department doesn't have maintenance workers on site around the clock, but social media offers information about what has happened in its parks, Naaman noted.

If picnickers and barbecuers make a mess over the weekend or a mother finds a park slide filled with snow, social media knows. The parks department could use the application’s aggregated meta- and geo-tagged data on social media to send crews to the proper parks, playgrounds, beaches, pools and other public places.
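A rough sense of how such aggregation might work: match posts to a park, filter for maintenance complaints, and rank parks by report volume so crews go where they are needed most. This is a hedged sketch under assumed inputs; the park names, keywords and matching step (e.g., reverse-geocoding coordinates to a park) are illustrative, not ParkBeat's design.

```python
from collections import Counter

# Sketch of ParkBeat-style triage, assuming posts are already tagged with a
# park name and filtered with a simple keyword list. Everything here is
# illustrative, not the actual ParkBeat pipeline.

ISSUE_KEYWORDS = {"trash", "broken", "graffiti", "snow", "overflowing", "mess"}

def maintenance_posts(posts):
    """Keep posts whose text mentions a maintenance issue."""
    for park, text in posts:
        if set(text.lower().split()) & ISSUE_KEYWORDS:
            yield park

def rank_parks(posts, top_n=5):
    """Rank parks by the volume of issue reports."""
    return Counter(maintenance_posts(posts)).most_common(top_n)

if __name__ == "__main__":
    sample = [
        ("Prospect Park", "trash everywhere after the weekend BBQ"),
        ("Prospect Park", "overflowing bins near the bandshell"),
        ("McCarren Park", "slide is full of snow"),
    ]
    print(rank_parks(sample))  # [('Prospect Park', 2), ('McCarren Park', 1)]
```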

People pay attention to information, capture it and share it – and these applications tell a useful story about that data, said Naaman. Just as Google revolutionized the Internet by making heaps of information accessible and usable via PageRank, Naaman is making sense of and organizing the information people share now.

CityBeat and ParkBeat are for distinct audiences, but Naaman has helped create platforms for the public as well. He’s the co-founder of Seen – a startup that automatically collects, aggregates and organizes, in real time, social media content captured at events. Seen was selected as one of the first companies for The New York Times incubator initiative, TimeSpace.

Several social media platforms exist, but the tools to evaluate what really happened through the cacophony of posts are limited. Perhaps an upload is not relevant or interesting, or is too noisy or too short. "The information is not prioritized in any way that a viewer can make sense of," Naaman said.

“Seven or eight years after social media took off, we’re still doing a very poor job representing events in social media,” Naaman added. Using an algorithm Naaman calls AttentionRank, Seen can take 20,000 tweets from a performance or a festival and make them digestible. To date, Seen has “indexed 2,500 events so you can relive them in a way that wasn’t possible before,” he said.
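The article names Seen's AttentionRank but does not describe how it works, so the following is only a hedged sketch of attention-based ranking: score each post by engagement signals and keep the top few per event. The weights and field names are assumptions for illustration, not Seen's algorithm.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    retweets: int
    favorites: int

def attention_score(post: Post) -> float:
    # Weight shares more heavily than likes; purely illustrative weights.
    return 2.0 * post.retweets + 1.0 * post.favorites

def digest(posts: list[Post], k: int = 10) -> list[Post]:
    """Reduce thousands of event posts to the k that drew the most attention."""
    return sorted(posts, key=attention_score, reverse=True)[:k]
```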

Naaman’s talk was part of Inside Cornell, a monthly series held in New York City featuring researchers and experts for members of the media.

This article originally appeared in the Cornell Chronicle.
