MediaPost Blog – SearchBlog
by Laurie Sullivan

Members of the generation raised on Google want search engines to know how they think and feel. They want visual search, and they want engines to serve them ideas based on personal information stored in social sites across the Web.
For example, when they search for movies after they’ve had a bad break-up, they want search engines to filter out romantic comedies. That might mean processing brain waves and reading physical movements through a PC camera, similar to Microsoft’s Project Natal.
That’s how a handful of Ball State University students would rethink how they find information online and offline, and how they would redesign search across a variety of platforms and devices, according to Jen Milks and Michelle Prieb, project managers at the university. The two shared their findings during the closing session Saturday of MediaPost’s Search Insider Summit.
The students also considered older generations, such as a 75-year-old Aunt Bess who wants to upload and share Cindy’s wedding photos. They believe search should become easier for those users to understand and use, too.
Verbal communication would also be integrated into search, according to the students’ redesign, providing a continual feedback loop. Both parties would contribute to the conversation equally, with searchers telling the engine their likes and dislikes through their preferences, social sites, email, Web browsing history and the blogs they read.
All the information in those sources tells the engine about the searcher, giving marketers insight, through search queries, into what people want to see. The engine would ask clarifying questions during a search: “Are you looking for an engagement ring for yourself, or are you helping someone else do the research?”
The students also want an option to make the search process either targeted or exploratory, depending on whether they know specifically what they want. Perhaps Google should replace the “I’m Feeling Lucky” button with “I’m Feeling Adventurous.”
Similar to how Pandora analyzes the composition of songs based on prior picks, the students want search engines to analyze the composition of their virtual identity and serve them recommendations based on that process.
Integrating social data into search would make both searches and ads better targeted. Milks called keyword searches imperfect. But what if search engines could pull data from social sites, including Twitter, that the person searching has entered and made public, along with recommendations from friends?
Cloud computing will give people searching more power across a variety of devices everywhere, from the PC to mobile to the television. They want images and visual search, touch, voice activation, and augmented reality.
