
Knowledge Graph – For the Greater Google

Posted by Cory Collins on Sep 11, 2012 2:42:23 PM

Google has gained ground with yet another innovative step in the form of their new Knowledge Graph. In the unending quest for user satisfaction (not to mention taming the wild interwebs), they’re pushing the boundaries of search, leading them further into the murky waters and ruffled feathers of content publishing.

Google dabbling in the realm of content creation is a volatile subject for many, particularly when that content creation directly correlates with their role as a search engine.

To put it directly, many feel their toes stepped on by Google dabbling in content publishing and hosting. Webmasters fear losing traffic, ecommerce sites fear losing sales, and whole industries quake when considering the impact a giant like Google could have if it were to wander into their realm.

My recent trip to SES San Francisco highlighted this rise in tension between webmasters and Google. Surprise keynote speaker Matt Cutts, Mr. Google Search himself, was confronted with direct questions and heated responses during the Q&amp;A that followed his keynote speech.



Cutts, cool as a cucumber, managed to largely dispel the tension. Yet there can be no question that as Google moves forward there will be those who question its direction and fear its motives. Google is such a large corporation, with such vast resources, that any foray into new markets could potentially stifle competition.


Let’s take a look at Google’s Knowledge Graph and why it caused such a rise in tension.


The Innovation

The introduction of Google’s brand spanking new Knowledge Graph has some webmasters in an uproar. The feature, by the way, is rather poorly named in my opinion. The name Knowledge Graph accurately explains the core idea, but it doesn’t capture the spirit of this pioneering step toward meshing computer algorithms and human knowledge.

The Knowledge Graph is Google’s attempt to build a deeper understanding into search. Rather than simply (heh) returning websites relevant to a query’s keywords, Google is attempting to understand the thoughts underneath the search. To teach not just word association, but idea association based upon words. Which, quite frankly, could be an exciting step toward advanced artificial intelligence.

Don’t get me wrong; talking butler robots are hardly right around the corner. But it’s small, nearly imperceptible steps such as this that lay the foundation for extreme advancements.

So, how could a better search have webmasters agitated? Well, it’s primarily based upon how it functions, and specifically the manner in which Google displays the new results.

How it Works

First, Google is drawing from a variety of free public sources such as Wikipedia, Freebase, and the CIA World Factbook.

Beyond this, they’re also crawling the web at a rate only Google can accomplish, seeking out relevant content and continually improving the Knowledge Graph.

Once Google has gleaned a group of core ideas surrounding a keyword, they paste the most relevant information into a handy chart to the side of the SERPs.

For example, if you were to Google, say, U2, a veritable slew of information appears concerning the band U2. This chart includes a brief summation of the band, the members, top songs, upcoming shows, albums, and even similar artists. Basically everything you’d ever need to know concerning the band U2. Which, for webmasters, is exactly the problem.
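The mechanic described above can be illustrated with a toy sketch in Python. To be clear, this is purely hypothetical and says nothing about Google’s actual systems; the entity store, field names, and data are all illustrative. The point is the shift it represents: when a query matches a known entity, stored facts come back as a ready-made panel rather than as a list of links.

```python
# Toy sketch of a knowledge-panel lookup. NOT Google's actual
# implementation; entity names, fields, and data are illustrative only.

# A minimal "knowledge graph": entities keyed by name, each with facts
# and links to related entities (e.g. similar artists).
KNOWLEDGE_GRAPH = {
    "u2": {
        "type": "band",
        "summary": "Irish rock band formed in Dublin in 1976.",
        "members": ["Bono", "The Edge", "Adam Clayton", "Larry Mullen Jr."],
        "related": ["coldplay", "r.e.m."],
    },
}

def knowledge_panel(query):
    """Return side-panel facts for a query, or None if the query doesn't
    match a known entity (in which case only the usual list of websites
    would be shown)."""
    entity = KNOWLEDGE_GRAPH.get(query.strip().lower())
    if entity is None:
        return None
    # Paste the most relevant facts into a chart shown beside the SERPs.
    return {"title": query.strip(), "facts": entity}
```

A search for “U2” would return the panel directly; a query with no matching entity falls through to ordinary results. That fall-through is the whole tension of this article: every query answered by the panel is a click that never reaches a website.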

Why this is a Problem

Many website owners feel there has been an unspoken tenet, a silent agreement, an unvoiced understanding between themselves and Google. An understanding that because Google is a search engine, and only seeks to direct traffic to websites, Google is allowed unlimited access to these websites’ content.

Now, as Google begins to host its own content, parallel to the search engine results pages (SERPs), there are fears that Google is leaning toward breaking this covenant.

Search engines have been based upon a model of retrieving relevant websites which will, in theory, provide deeper material that might interest users based upon the keywords they used in their search. This function, or algorithm, has become more and more complex over time, increasing the effectiveness of the results, simplifying the search.

This is the next logical step, and it’s a big one. One in which Google directly returns information, no further websites needed.

A controversial statement, I know. And perhaps this isn’t the direct path search will take. But there can be no doubt that the search model is slowly changing. New search programs such as the famous Siri have altered the search field forever.

In this new advanced world, people want to be able to ask a question in search and get a direct answer. Efficiency is key, and having to click through to a website in order to find that answer is beginning to be viewed as a waste of time.

In the words of Google, from the introduction post concerning the Knowledge Graph:

“This is a critical first step towards building the next generation of search, which taps into the collective intelligence of the web and understands the world a bit more like people do.”

More complex searches will of course continue to draw a list of relevant websites that can provide a more in-depth answer to a searcher’s query. But queries that resolve to simple facts, such as “What is the temp outside?”, now draw upon the Knowledge Graph to produce a quick visual answer. Which means that, in theory, websites may be receiving less traffic in the future from Google. Worse yet, websites that specialize in niche knowledge (say celebrity facts, weather forecasts, geographical data, etc.) are in a position to take the biggest hit, since Google can simply mine the data and then host it in their Knowledge Graph.

This is all theory thus far of course. Google still provides a large portion of traffic to websites every day, and there isn’t much data concerning how often the Knowledge Graph is utilized and satisfies queries.

The Great Google

Google justifies these new search features as being best for the user, which is a valid rationale. But one can’t help but realize that this is also meant to keep Google ahead of its competitors. This raises the question: where is the line?

Google is famous for its “Don’t be evil” line. And there’s nothing particularly sinister about their evolution as a search engine. Yet there can be little doubt that this step toward Google’s own good (as well as their users’) might be piggybacking on the efforts of webmasters.

It’s also hard not to look at the direction Google is going without skepticism. It is extending so far out into various fields that, personally, I can’t help but wonder where the point of dilution occurs.

There’s a thin line somewhere here in the relationship between search and answer. Webmasters have worked hard providing the best, most relevant answers to appear in Google’s list of results. But now Google is providing answers, which has the potential to crowd out websites and freeze necessary traffic.

Where’s the line drawn between helping users and hurting webmasters? I’m not of the opinion we’re there yet, but I think it’s foolish not to see the actions of today and wonder at the consequences of tomorrow.

It’s an interesting question, and unfortunately there’s no clear cut answer—yet. What do you think?


Cory Collins is a head writer, web content developer and team leader at Boise’s Page One Power. Collins is passionate about SEO, link building, and other white hat practices, and writes about it for Page One Power’s Link Building News and countless online publications. Connect with Collins on Twitter or Google+.