The Future of Search Panel Recap - A Page One Power Webinar
Hello everyone and welcome to the video recording and recap of Page One Power's Future of Search Panel webinar.
First, thank you to our excellent panel for sharing their time, expertise, and insight with us! Our panelists were:
- Russ Jones, Principal Search Scientist at Moz.
- Eric Enge, CEO of Stone Temple Consulting.
- Jono Alderson, Global Head of Digital at Linkdex.
- Nicholas Chimonas, Director of R&D at Page One Power.
This was one of the best conversations we've had the pleasure of hosting, with insight, humor, and concrete advice shared by all.
Note: this recap will be a paraphrase of the conversation, not a direct transcription. If you're interested in hearing the actual conversation, I'll embed the video throughout.
The webinar panel lasted one hour, during which we covered a myriad of topics, ideas, and insights.
To make it more digestible for you, I've broken it down here. Videos are embedded at the beginning of each question, with yellow subheaders to denote the topic of conversation within the question.
I've also used blocktext to pull out pieces of the conversation that were especially interesting. Remember, these aren't direct quotes.
If you want to simply watch the whole video, look no further:
Here are the questions the panel discussed:
- How search evolved from the beginning of their career.
- What search features they're most interested in today.
- What they predict to be the next innovation to impact search.
- How they identify meaningful trends in search versus experiments that die on the vine.
These questions served as a guide for the conversation, not a script.
Oftentimes our panelists asked one another questions and wandered down tangents. The result is an interesting, fun, unscripted conversation.
Question One: How has search evolved from when you first began your career?
The discussion begins at 1:26.
Nicholas: Let's start with Eric Enge as he's the most seasoned veteran here.
Eric: When I began in search it was so incredibly easy to rank. We started an exact match domain (online-motorcycle-parts.com) and within seven days of launching we were ranking #7 in Google.
It was very keyword-centric.
There wasn't much notion of authority, it was mostly a relevance-based algorithm. It was easy to fool just by repeating a keyword over and over again.
Today there are considerably more factors.
You now have to take a holistic approach to ensure you'll not only survive but be a winner three to five years down the road.
Back in the early 2000s you didn't need to do that.
Content, Links, and Query Disambiguation
Nicholas: Russ, let's get your perspective. Where did you come into this game and where do you see it going?
Russ: I'll echo what Eric said. I was on track to become an attorney, like my twin brother and my older brother, but then decided to stick with this SEO thing.
My twin brother asked me "why are you doing this?"
My answer was simple: "stupid people can make money in SEO."
I don't think that's the case anymore. You have to either have intelligence or investment to succeed.
At one point I had a double dash domain. I had hearing--aids in my URL. Apparently that's legal.
The core factors that began search engines have remained largely the same: relevancy and authority, content and links.
There's been an ebb and flow of which factor seems to matter more, and I think that's really about supply and demand: how hard it is to get content versus how hard it is to get links.
One big and important shift over the last year or two is the set of changes Google has made around query disambiguation—Hummingbird and RankBrain.
If there is one big shift bigger than even Panda or Penguin, it's that there is no longer a 1:1 ratio between the keyword searched and the optimization on the page for that keyword.
That's where we're going to see the biggest shift in SEO over the next few years. It's still going to be about content and links, but what qualifies as a good link or good content will be broadened by the way Google understands language.
Eric (joking): As long as you can say RankBrain took over the whole algorithm, I'm good with that.
Nicholas: That's a really good point. Essentially, it's always been about the fabric of the internet: content and links. That's the internet.
What's really changed is Google's ability to understand keywords, and disambiguate all the variations.
What's the stat of how many unique queries Google sees per day?
It's mind-blowing that Google receives that many unique searches per day.
Nicholas: Alright Jono, let's pass it to you.
Google's Influence on the Web
Jono: Controversially, I think it's a bit broken.
I started as a techie, entirely self-taught, and became obsessed with questions like how do you do this better, what's the perfect title tag, should I use alt attributes, and became a technical SEO without knowing it was a thing.
It was so simple, and so purist.
Now I've fallen into the world of links and authority signals, and it's become frustratingly complex. That complexity has spiraled.
Google's original mission of "organizing and categorizing the world's information" has morphed into shaping it, because their systems aren't capable of truly understanding relevance and authority.
Now they're left with a world where they're imposing what good marketing and branding look like, which leads to an awkward coexistence of organic and paid, and Google defining the moral code of websites.
For example, guest posting and directories. Intrinsically, these are neither good nor bad. Our reaction to Google's model has shaped the fundamental makeup of the web, which is terrifying.
So we're left in this place where, for example, artisanal crafters have to be great marketers and great writers in order to sell their hand-carved wooden furniture, rather than relying on the quality of their product.
Ecommerce websites with big budgets and big sites can churn out generic content at scale.
It's a bit broken at both ends of the scale.
It's so important for Google to better understand context, intent, and meaning so they can cut out this artificial model of what relevance looks like.
We're almost at the tipping point where the entire way we've thought about search and generating relevance and authority signals will no longer be applicable.
Google will be, via RankBrain, phenomenally more capable than we'll ever be at understanding the web and what good looks like.
The Tragedy of 10 Blue Links
Russ: I completely agree.
We had this conversation a while ago, with Mark Traphagen as well, about the tragedy of ten blue links.
The premise of Google is to consolidate recommendations down to ten blue links based upon what a person is searching for. It's potentially dangerous.
I just searched for restaurants where I'm located, and it recommended three decent restaurants in my area.
But if someone came up to me, and asked me about good restaurants, I'd immediately ask them back "well, what are you looking for?"
I wouldn't try to interpret an answer from the data they've given me. I would say "I can't give you a good answer, without more information."
We see this across the board with search engines. It's shaped the success and failure of businesses, it's caused consolidations, and I'm not really quite sure what the solution is.
Eric: Personalization is a big part of this story.
Search engines can and do learn over time that I tend to go to the Italian restaurant around the corner.
So if I'm in a new city they might be more apt to suggest an Italian restaurant.
They can try to include personalization.
It's an interesting point you bring up Russ, that we don't want normal human interaction with a machine. Maybe that will change as we get into higher levels of personalization, and interact more with voice search.
Nicholas: There's also the risk of self-confirmation bias with over-personalization.
Eric: Right. Serendipity is good; we like to be surprised by new things.
Nicholas: We should—we should want to broaden our horizons.
For example: this person is a Republican, so let's only feed them Republican search results. Would that really be the best thing?
Google would then be influencing people in a way they should not.
Jono: The continued existence of the ten blue links, in whatever format they take, is hugely important, because there has to be some sort of intermediary between a raw, context-free query and a single answer.
There's a huge risk that as personal assistants and things like Google Now continue to take off, we'll lose transparency.
That will push us into the world where the winners are the people who do a large flashy PR campaign to simulate relevance. We'll move more into a place where the big budgets win.
Big Budgets Versus Agile Marketing
Eric: This notion that people who do really effective marketing—which can involve big budgets—will win isn't unique to the internet.
This has been true for 30 years.
There's always a new opportunity for an aggressive, fast-moving small player to build a niche that's distinct from everyone else.
Big budgets are the easy way to build visibility, but if you're the first big player on a new social network that creates a huge opportunity, which isn't something that was around 30 years ago.
Nicholas: Absolutely. Jono mentioned earlier that Google is transforming the very nature of the web—such as Nofollow, authorship, and AMP.
This is a good segue into our next question.
Question Two: What search features are you most interested in today?
AMP: Good for Users, Here to Stay
Nicholas: I'd like to kick this off with a discussion about AMP, and whether the panel sees this as a long-lasting element, or whether it will go the way of the dodo.
Eric: My opinion is that AMP has legs.
Our testing shows a 71% reduction in page size across a few sample pages. Pages that scored 42 in Google's PageSpeed Insights rose to 88. It's dramatic.
There are implementation issues with AMP's stripped-down version of HTML, but you have to do something to make your pages faster on mobile.
Speed in mobile is a big, big deal.
Jono: It's worth considering Google's commercial model when you consider AMP.
Google monetizes people searching and consuming content.
In the context of mobile, that's often a poor experience. 30% of Google's AdWords revenue comes from mobile searches—that's a huge chunk of cash on the line if people have poor experiences on mobile. They'll go to apps instead.
It's imperative that Google keep searchers in the search ecosystem.
AMP is a very clever play by Google to maintain the normalcy of search.
It's a commercial play—as adoption grows, it's not far-fetched to imagine Google taking an aggregation of AMP pages and creating a hybridized experience, where Google takes components from those and builds their own solution. Google would be the interaction layer on top of the web, and all the websites behind them would essentially be APIs and data providers.
Russ: I have a lot of mixed feelings about AMP.
I remember the first time I saw an AMP result and clicked on it. The only way to describe it is the same amount of magic I felt when I moved from a 56k modem to cable internet access and saw the change of speed.
In that respect, there is something to be said that AMP is good for users. But AMP is definitely good for Google, as well.
Google gains control over the content for the mobile web, which gives them more control over their revenue.
If it's good for Google, and it's defensibly good for users, I think it's going to have sticking power.
I can't say I'm a big fan of AMP, but I can only give in-principle arguments against it. I can't give practical arguments against AMP.
The people who are upset with AMP—myself included—are upset with giving Google so much control—not upset with having really fast web pages.
I think AMP is going to have more staying power than Google+.
Google Authorship: Training Data for Google?
Eric: We began the conversation with "will AMP have more staying power than Google Authorship?"
Don't overlook the whole possibility that Google Authorship was just a training process for an algorithm to help them better understand authorship. It may have been a complete success for Google.
Jono is right about Google wanting to get searchers away from the world of apps.
Google is doing well in apps, by the way—they have six of the top nine apps, and Facebook has all but one of the remainder. Between them, Google and Facebook have eight of the top nine apps.
But apps aren't as friendly to Google.
The Fragmentation of the Web
Nicholas: I'd like to interject with a thought from the audience.
The implementation of AMP is convoluted. Small companies have to hire developers with AMP experience, and WordPress plugins don't fully meet AMP's requirements.
Only companies with big budgets—or those who understand the constantly evolving landscape and are willing to hire the right person—have a significant advantage. The technology that Google rolls out will continue to give the upper hand to the bigger companies.
Search features in general have continued toward this trend. Would you agree?
Jono: This is a short term challenge—it's actually much bigger than AMP.
As content consumption continues to fragment, the challenge is how do I transform myself from being a business with a website, to being a business that has content, that is deploying into multiple locations.
Businesses will need to pivot their output into 6 different apps, rather than just a website.
As an entrepreneur, when I hear conversations like this I immediately think: opportunity.
I can gain an edge on big companies with big budgets if I can be nimble and quick, and implement these technologies before they can. I can gain some traction, market share, and grow my business. I love these situations.
Nicholas: I agree with that. But, when you talk to the average small business owner, it's rare they even know these things exist.
With AMP results, "AMP" appears beneath the result in search, along with a lightning bolt in a circle, but clicking that doesn't take you anywhere. There's no explanation for the layman.
Perhaps searchers realize the page loads quickly. Perhaps they research AMP for themselves. But the barrier to entry is higher.
I've seen eBay start using AMP pages. It's strange because first you'll see an intermittent page that shows an eBay product, but if you click around it takes you to regular pages that load more slowly. There's a bit of friction there.
Jono: Absolutely. With AMP it's almost inevitable that someone is going to have a broken experience. When they come in through AMP it loads quickly—it's excellent—but as they move around they're pushed into a mobile site that just falls over.
Eric: Commerce and retail pages aren't really formally supported in AMP at this time, so it's interesting that eBay pages are beginning to show up.
Retail isn't an announced component of AMP; it's largely for news and content pages.
SERP Features: Volatile and Varied
What interests you now, and what's coming in the future?
Russ: I don't get to work with Dr. Pete as much as I'd like, but we do have a decent amount of cross-over.
The most interesting part of SERP features isn't so much what the feature tells us about the SERP, but rather what the SERP feature reveals about the query.
We tried to determine the relationship between individual SERP features, and groups of SERP features, and the likelihood that a search user would click an organic result.
As you can imagine, if there's a giant map on a local query, the odds that a searcher clicks an organic result are lower than if there were no map at all.
Common-sense intuition says that adding any SERP features to the page should decrease the organic click-through rate (CTR) by some margin. But in some situations, that's not what we found.
For example, when a SERP has the knowledge panel, sitelinks, and tweets features together, it corresponds with a near-100% organic click-through rate—far above what you would expect.
But then if you go look at the only queries that have all of those features, they're all major, active brands.
So what is happening is everyone is clicking on the branded organic link. That's happening nearly 100% of the time.
So the SERP features aren't impacting what the user did, but instead revealing information about the query—these are active brand searches.
On the other hand, we see wide volatility for other features.
The standard featured snippet box can have almost no impact, or it can remove nearly all organic traffic—the conversion calculator, for example.
If you search CNY to USD to convert the Chinese Yuan to U.S. dollars, and look at the Clickstream data you would see about a 0% click-through rate to organic, because everyone just goes ahead and uses the calculator on that page.
But other featured snippets of the same size, taking up just as much of the page, might only impact organic CTR by 1%.
So there's this huge variety of ways in which SERP features can influence organic click-through rate.
I'm very much interested in following those trends and building models that help users determine the right keywords to target, based on the likelihood that a user will end up clicking on an organic search result.
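As an aside, the kind of model Russ describes can be sketched in a few lines of Python: aggregate clickstream-style rows by the combination of SERP features present, then compute organic CTR per combination. This is a toy illustration, not Moz's actual methodology; the queries, feature names, and click data below are all invented.

```python
from collections import defaultdict

# Toy clickstream-style rows, invented for illustration:
# (query, SERP features present, did the user click an organic result?)
rows = [
    ("moz", ("knowledge_panel", "sitelinks", "tweets"), True),
    ("moz blog", ("knowledge_panel", "sitelinks", "tweets"), True),
    ("cny to usd", ("conversion_calculator",), False),
    ("100 cny to usd", ("conversion_calculator",), False),
    ("best crm", ("featured_snippet",), True),
    ("best crm software", ("featured_snippet",), False),
]

def organic_ctr_by_feature_set(rows):
    """Estimate organic click-through rate for each combination of SERP features."""
    searches = defaultdict(int)
    clicks = defaultdict(int)
    for _query, features, clicked_organic in rows:
        key = tuple(sorted(features))  # treat feature combinations order-independently
        searches[key] += 1
        clicks[key] += int(clicked_organic)
    return {key: clicks[key] / searches[key] for key in searches}

for features, rate in sorted(organic_ctr_by_feature_set(rows).items()):
    print(", ".join(features), f"-> {rate:.0%}")
```

Run against real clickstream data, the same aggregation surfaces patterns like the ones Russ describes: branded feature combinations near 100% organic CTR, calculator-style snippets near 0%.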
Nicholas: That really speaks volumes about optimizing for intent, beyond the typical keyword research and optimizing for volume and competition level.
Every different query has different intent.
There's sweeping advice like "don't target this query because of all these SERP features"—and that can be true, but it also might not be, depending on the intent of the keyword.
That's in line with what RankBrain is doing, trying to understand the query, and I think that makes sense for us to do as SEOs as well.
Eric: Then there's the idea that "okay, there's all these features, but can I play in these features?"
There are things you can do to increase the likelihood of appearing in a featured snippet.
If I'm currently ranking #7 and I modify my content, it may not change my rankings in the traditional sense, but suddenly I'm in the featured snippet as well, ranking at position #0.
So that's another way to think about it: leverage those new features to your advantage.
Nicholas: That's how small players are beating out big players—targeting a snippet.
Jono: We see people tactically target snippets when they're at the bottom of the page and actually do better than they would if they were to get to position #3 or #4.
Nicholas: Sure. Well I think that leads us nicely into question three.
Question Three: What is the next innovation to impact search? How do you expect search to evolve over the next 5 years?
SEO: Fragmentation, Digital Assistants, Brands
Nicholas: You want to take it from there, Jono?
Jono: The really cool stuff coming up is personal assistants and preemptive search. That includes Google Now on your phone, Google Home, and Amazon Echo.
The challenge for us is there is no search. As search professionals we'll need to switch our focus to earning recognition and brand preference before people search, so when they do have needs, you're the brand of choice.
That's going to be an interesting challenge, given it will change the way we operate and report.
The fragmenting marketplace will be another challenge. Places like Etsy and eBay, plus in-app transactions, mean that less and less consumer behavior will happen on our big, fortress-esque eCommerce sites. It will be increasingly distributed.
Also, consumers are becoming more educated.
You've got multi-device, multi-visit behavior, people looking at reviews, understanding pros and cons, and building complex brand preferences.
All of this will make it really hard to distill down to something as simple as "how many visits did I get?" or "where am I ranking?"
Our big challenge as SEOs will be moving from asking "how is my website doing?" to "what's the experience like when they search, and how do I ensure I'm well represented or promoted in some sort of preemptive search?"
All of this will converge and require us to fundamentally change the way we think about search from "how do I get higher in the results?" to "how do I understand what people think, and what they feel, and therefore what they do?"
It's going to be a fun game.
Reporting: Not Provided, Shrinking Data
Nicholas: Reporting is an unsolved problem that only gets worse as time progresses.
There's a great presentation by Rand Fishkin that I saw at SearchLove. He presents the idea that traditional reporting was much easier when we had keyword data in Google Analytics, when "(not provided)" wasn't such an issue, when AdWords wasn't restricting data, and so on.
How do we move into better reporting, as the data seems to only shrink?
Jono: It's an interesting challenge, isn't it?
It puts SEOs much more in line with where traditional PR and TV have been forever. Ironically, SEOs have spent our entire existence pointing and laughing at how traditional media measure results, and at how much more sophisticated our reporting was.
Traditional media still gets more budget than us, and now we'll have to learn how they report, how they get buy-in.
We might have to look to the old world a bit.
Eric: We're looking at massive fragmentation that's going to happen in device types.
By 2020, more than 75% of connected devices will be something other than a smartphone, tablet, or traditional PC.
That's a staggering projection.
Many of those devices you won't interact with on a regular basis, but I expect to see more voice interaction and digital assistants that will help you track your work across devices and platforms.
It's going to be a very dynamic environment.
When you step back and think about what this means, I can't imagine there's a path to success unless you've built some level of brand—even if it's a local brand.
The Tragedy of One Answer
Because even if you try to operate in a world where your goal is to rank higher—rank higher in what?
If you're asking Google Now or Alexa to order you a pizza, the only thing you can do is attempt to have brand preference, be relevant, be known, and have a relationship with the consumer.
There suddenly isn't a world where you can say "I'm six out of ten" or any of those metrics.
Russ: I'm going to bite the bullet and play devil's advocate.
I agree with the importance of building a brand, because it continues to pay dividends in the future, even if you stop investing in it as you did earlier.
There's another problem with fragmentation, which we discussed earlier.
If a small business tries to be a jack of all trades, across all the platforms that are relevant to them, they will fail, because they've divided their resources too broadly.
There are going to be smart choices to focus on. It should come from good data and good reporting.
The example of ordering a pizza from Alexa is really important. If we start seeing the consolidation of the tragedy of ten blue links moving to the tragedy of one Alexa answer—whoever succeeds there, gets the whole pie.
Imagine you have ten competitors across five different spaces. If five of those competitors try to compete evenly across all the spaces, while the other five each target one heavily, there's a real possibility that the ones who spread their marketing dollars thin end up ranking #1 in none of them—and, in fact, rank #2-6 in all of them.
For example, we had a client come to us about 5 years ago, and we asked where he got his business from—because they weren't ranking in Google.
He said "the Yellow Pages".
I was shocked, because who uses the Yellow Pages anymore?
He walked me through the numbers. The average Yellow Pages directory that makes its way into a house sits there for about six years, which means they still get calls from directories delivered six years ago. Also, unlike any other marketplace out there, if you pay enough money you can own the entire first page.
So there are opportunities to dominate one vertical—one fractured environment—and some of the businesses that are most successful will see these opportunities and focus on them, rather than be all things to all people and all devices at all times.
Eric: I completely agree with that, and nothing I said was meant to suggest you shouldn't focus your dollars. Especially for a small business it's important to focus on one or two things and do them extremely well rather than do a miserable job at a whole bunch.
If you have a strong brand, when people go to a new device, they will intrinsically look for you, because they have that association.
Google: Search Experience, Not Search Answer
Nicholas: So we've discussed the tragedy of ten blue links, and the future possibility of the tragedy of one answer.
We've seen some of this with instant answer boxes and knowledge graph panels, but at the same time they're keeping the organic results on the page.
Are we safe from search being distilled down to one answer, because showing several results is simply the best way to display them? Google understands that it's human nature to be curious and to want more than one answer. Or are we already on our way to idiocracy?
Eric: I don't think so. I really don't. There's a tremendous amount of commercial pressure to not do that.
Even with the strongest results, the first result receives about 42% of the clicks, which is a BIG number. That means 58% went somewhere else.
There's a major disincentive to get down to a single answer. If 58% of people aren't satisfied with it, there are too many other places they can go to find an answer they like better.
Jono: It's worth considering that Google's objective isn't to get users to the best page, or even provide the best search results—Google's objective is to provide a good search experience.
So even if Google could predict that #1 best answer 98% of the time, there's a real risk that it compromises the search experience and Google loses users.
Russ: To play devil's advocate: I think you're right in terms of general organic. But voice search is a really interesting phenomenon. People don't want a list of things read off to them.
If I ask for directions to the closest CVS, it takes me to a CVS about four miles away from the one I use, for no apparent reason other than Google not liking my CVS. I say that because mine is listed, it's there, and it is closer, but Google gives me the other one.
Same thing with the bank I use. My wife once swerved across four lanes of traffic because we saw another bank on the way to the one Google was directing us to.
As long as users don't know they're getting an unoptimized, or imperfect answer, they'll assume they're getting the right one.
Google will never get a thumbs down—or whatever metric they're using—if people don't realize Google got it wrong.
You'd have to follow their GPS directions off a cliff before realizing Google got it wrong. How do you know the directions they gave you were actually the fastest?
Nicholas: That's a good point. Google has said time and time again that they're trying to build the Star Trek computer. And when the Star Trek computer gives an answer, it gives one answer and without attribution.
Russ: If they build the Star Trek computer I'll be happy. I won't complain.
Nicholas: I think we have time for one more question here.
Question 4: How do you identify trends that will have a lasting impact in search versus experiments that may die on the vine?
The Litmus Test: Repeatability
Nicholas: Much of what we've been through is data farming for Google, Authorship being an example.
Russ, why don't you kick us off?
Russ: I use the litmus test "how repeatable is this tactic?"
If you ask how easy it is for a competitor to reproduce a tactic or strategy, then you have your answer. Even if Google isn't able to stop it, as long as ten or more competitors do the same thing it becomes as valuable as putting a keyword in your meta keywords tag. Everyone does it, and it's no longer valuable.
How hard will it be for a competitor to reproduce?
The Multi-Keyword Problem
Nicholas: You've talked a lot about RankBrain and how to interpret it the right way, which is that it's understanding queries better.
Russ: That's a perennial question in SEO—I call it the multi-keyword problem.
You only have so many characters in the title, so many in the description, and so many in a reasonable page length. And now that we've got a lot of keywords and related words to optimize for, as opposed to the old model of one keyword for one page, how do we create a thorough, relevant piece of content that broadly matches the different interpretation layers that Google has created?
I know we're going to see tools around this multi-keyword problem.
Nicholas: I must tip my hat to Moz's Keyword tool.
I absolutely love the different ways you can tell the tool to interpret the word you've given it. If anyone in the audience hasn't used the tool you must go try it.
Eric, what do you think? How do you identify whether a trend is meaningful or just an experiment?
Perceived User Value
Eric: From my perspective, you have to ask: "what is the perceived user value?"
With Authorship, that value was low. Getting the author's picture into the search results does little for the user.
With AMP, even Russ admits the page is fast; the value to the user is high.
That's how I try to predict whether a search trend will have staying power: perceived user value.
Nicholas: How does that affect what you do? Do you go for the early adoption just in case it takes off?
I've heard that opinion on AMP from other marketers.
Eric: Well, the short answer is that I have an AMP version of the Stone Temple website.
Having said that, there's a cost-to-value calculation.
With Authorship, the perceived benefit to users wasn't enduring, but the implementation was pretty simple.
You have to look at that cost-value trade-off. AMP is definitely a higher mountain to climb to implement, but the perceived benefit is much higher. I believe it will be around much longer.
Google Recycles Tech
Nicholas: Jono, why don't you wrap us up with your thoughts?
Jono: I'm actually bad at this. I get so excited about all the new kit. I was ludicrously excited about Google Wave.
One of the challenges is that Google is terrible at marketing Google stuff. So Google Wave, Google Glass, Google Authorship, in a different world might have changed the world and been hugely successful.
What's interesting is that if you judge these on adoption, you miss the bigger picture. Google Wave is now the backbone of Google Docs' collaborative editing; Authorship, although no longer used, certainly feels like it's part of Google's kit; and the success of Google+ as an identity platform, if not a social network, is interesting.
For years and years Google said don't give us meta keywords or tell us what your content is about; we'll work it out for you. Now it's a complete pivot: use Schema.org markup, JSON-LD, and Open Graph tags.
There's a lot of difference between day one and seeing it in retrospect.
The key is looking at whether a trend improves the search experience, whether it builds a competitive advantage, and what Google's motivation is.
As practitioners: if you can solve it with money, outsource it, or scale it easily, it's probably going to die. The value will diminish.
As a philosophical view, how close is it to good?
For example, say you own a chain of restaurants. Should you be building campaigns and infographics, or should you be training the chef?
Google will always prefer you to train the chef, because that will correlate with all sorts of good signals, and all sorts of good outcomes. It may not have a direct result on search, but it's closer to good.
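As an aside, Jono's restaurant example maps neatly onto the structured-data pivot he mentioned. Below is a minimal, hypothetical sketch of the JSON-LD script tag a restaurant page might embed, built with Python's standard library; the business details are invented, and real markup should be validated with Google's structured-data testing tools.

```python
import json

# Hypothetical business details, invented purely for illustration.
restaurant = {
    "@context": "https://schema.org",
    "@type": "Restaurant",
    "name": "Example Trattoria",
    "servesCuisine": "Italian",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Boise",
        "addressRegion": "ID",
    },
}

def jsonld_script_tag(entity):
    """Render a schema.org entity as the <script> tag used to embed JSON-LD in a page."""
    return (
        '<script type="application/ld+json">\n'
        + json.dumps(entity, indent=2)
        + "\n</script>"
    )

print(jsonld_script_tag(restaurant))
```

Dropping a tag like this into a page's head is one concrete way a site tells Google explicitly what its content is about, rather than leaving it to be inferred.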
Nicholas: The real question is whether those were failed experiments. Did Google learn something from Authorship that they're able to use?
In terms of machine learning, perhaps we provided the training set data.
Jono: Absolutely—that's not even conspiratorial. If you worked at Google and were in charge of the next Penguin update, that's exactly the type of data you'd want.