The Future and Not-Future of SEO

JCash

Oldschool IM
Nov 25, 2010
Personalization - "Do Not Track" notwithstanding, SERPs will reflect your location and your online activity more and more. Measuring uncookied is already starting to be a pointless exercise.

Navigation Aids - Google Local was just the first. The trend is for Google to give a specific search result type, or "navigation aids," wherever it can. The day will come when Google figures out whether you're searching for news or a product, and shows something similar to its current shopping and news pages. Worse news: if you appear to be doing general research, you'll get results similar to QWiki's current offerings. One of Google's most talented people left in order to build a better search engine, Cuil. The guy left Cuil (and killed it by doing so) to focus on what became QWiki, but Cuil kept the sort of prototype auto-wiki that gave encyclopedia-style responses to queries. Who just bought the remnants of Cuil? Hint: starts with G. Guess who's starting to serve wikipedia-style navigation pages? Also starts with G. For example, if you search for my great-uncle Reggie you might see:

[attached image]


Objective - Starts with G is starting to realize there's some hubris and unavoidable bias in ranking web resources, and is starting to introduce more random flux alongside the planned flux. The "reverse sandbox effect" is very useful as it allows them to factor clicks into ranking, but there are also advantages to approaches like having 50 results rotate through the top 10, or letting msnbc, cnn, foxnews and hln round-robin for #1 for "cable news."
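The rotation idea is easy to picture in code. This is purely my own toy sketch (the function and parameter names are made up, and nothing here is Google's actual mechanism): keep a pool of 50 candidate results and cycle a different window of them through the top slots each day, so every candidate eventually gets click data collected on it.

```python
def rotating_top(pool, slots=10, step=1, day=0):
    """Toy round-robin: pick which `slots` results from `pool` to show
    on a given day, sliding the window forward by `step` per day and
    wrapping around the pool."""
    n = len(pool)
    start = (day * step) % n
    return [pool[(start + i) % n] for i in range(slots)]
```

With a pool of 50 and `step=10`, every result sees the top 10 once per five-day cycle, which would give the engine click-through data on all of them, not just the incumbents.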

The not-future? This "social indications" crap. Why would Google need to pay attention to Facebook likes, plusses and tweets in light of the aforementioned? Why, with something so easily abusable (they hate the backlink market; don't you think they'd equally hate the sellers of plusses and likes and social bookmarks?), would they give it any credibility? Not to mention they already tried letting the booboisie get their jabberings in, and turned it off - remember Google Realtime? The spam and noise squelched the value, and they declined to renew their agreement with Twitter - btw gg guys. All that might happen is that your social connections' online activity might influence what you see in your search results, but the idea that plusses will get you #1 for a keyword just doesn't make sense.

Let's argue!
 


If they're going to give a crap load of weight to clicks and time spent on the page, what's to prevent you from getting a botnet and simulating users? I'm referring to a legal botnet in this case, but legitimate or not, it should work, correct? If it does, you could possibly even fuck over your competition by giving them poor user engagement. If it works like that, lol @ them not learning from mailers. If anyone's wanting to try it, I'd be willing to possibly write the software for fun. It's more or less the same thing we used to do with email.

EDIT:

If they're going to even localize the clicks, you could concentrate on specific areas that would naturally get the most clicks. For example, have botnets set up with IPs that geolocate to LA, NYC, Miami, Houston, Denver, etc. The trick would be getting the IPs to appear residential, but even that isn't really hard if you know where to look.
 
The "reverse sandbox effect" is very useful as it allows them to factor clicks into ranking, but there's also advantages to them taking approaches like having 50 results rotate through the top 10 or letting msnbc, cnn, foxnews and hln round robin for #1 for "cable news."

I had actually assumed that the current algo took CTR into account as a ranking factor.
Also, out of the 9 threads you've started, this and Let's Play Chess are my two favorites.
 
If they're going to give a crap load of weight to clicks and time spent on the page, what's to prevent you from getting a botnet and simulating users? I'm referring to a legal botnet in this case, but legitimate or not, it should work, correct? If it does, you could possibly even fuck over your competition by giving them poor user engagement. If it works like that, lol @ them not learning from mailers. If anyone's wanting to try it, I'd be willing to possibly write the software for fun. It's more or less the same thing we used to do with email.

EDIT:

If they're going to even localize the clicks, you could concentrate on specific areas that would naturally get the most clicks. For example, have botnets set up with IPs that geolocate to LA, NYC, Miami, Houston, Denver, etc. The trick would be getting the IPs to appear residential, but even that isn't really hard if you know where to look.

Thanks for sharing.
 
We all know now that the PRIMARY purpose of Google's manual search quality reviewers is to do experiments and create feedback for their algorithm, e.g. "all the sites that the manual reviewers flagged seem to have these 52 traits in common, let's algorithmically flag other sites that have this collection of traits".
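That feedback loop can be sketched in a few lines. This is a deliberately naive illustration of the idea, not Google's method - the trait names, thresholds, and functions are all my own invention: learn which traits are common across reviewer-flagged sites, then flag any unreviewed site that shares enough of that profile.

```python
from collections import Counter

def trait_profile(flagged_sites, min_share=0.5):
    """Learn the trait profile: keep traits that appear in at least
    `min_share` of the sites manual reviewers flagged.
    `flagged_sites` maps site -> set of observed traits."""
    counts = Counter(t for traits in flagged_sites.values() for t in traits)
    n = len(flagged_sites)
    return {t for t, c in counts.items() if c / n >= min_share}

def algo_flag(site_traits, profile, threshold=0.6):
    """Algorithmically flag an unreviewed site if it exhibits at least
    `threshold` of the learned spam-trait profile."""
    if not profile:
        return False
    return len(site_traits & profile) / len(profile) >= threshold
```

The "52 traits in common" version is the same loop with a bigger profile and a real classifier behind it; the point is that the manual reviews are training data, not the enforcement mechanism.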

My opinion on the future of SEO is that this "experiment-feedback-implement" process of Google will become increasingly faster, and ultimately so fast that spammers won't be able to "keep up".

So far it has been a case of "Google update -> spammers adapt -> Google update -> spammers adapt..." and so on. This means that the typical traits of web spam have been constantly evolving. Many people believe that no matter how many updates Google goes through, it will never be able to make its algorithm "bulletproof".

It doesn't have to. The speed at which Google changes its algo has been increasing for a while now, and many SEOers who felt the squeeze became sick of the increasingly fast-paced cat-and-mouse game and left SEO altogether to invest in more future-proof marketing activities (there are many such confessions on this very forum).

The pace of this game of cat-and-mouse will reach a critical point where the typical "SEO flavor of the day" will literally last a day (even less) before Google's algo adapts to it. Google will become an enormously dynamic machine that adapts ridiculously quickly to the most popular webspamming techniques at any given time.
 
Ok so that covers 0.1% of searches. Got any info on the other 99.9%? Stop acting like Google knows what the fuck they're doing plz. thx.
 
If they're going to give a crap load of weight to clicks and time spent on the page, what's to prevent you from getting a botnet and simulating users?

The bots would have to be cookied, and have a usage and engagement history, and that usage and engagement history would have to be favourable to whatever the SEO goal is, and the clicks would have to not appear to be bounces.
 
I had actually assumed that the current algo took CTR into account as a ranking factor.
Also, out of the 9 threads you've started, this and Let's Play Chess are my two favorites.

It probably does, although how it does is hard to analyse; I just know my server logs and GWT seem to show a correlation between me updating sites and briefly being #1 for a given term, and then some combination of low CTR or high bounce rate getting it relegated.
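That kind of eyeball correlation can be checked crudely from your own logs. A minimal sketch, assuming you've already extracted a day-indexed series of SERP positions (from GWT exports or rank tracking) and a list of the days you pushed updates - all names here are hypothetical:

```python
def avg_position_after(update_days, positions, window=3):
    """Average SERP position for a tracked keyword in the `window` days
    following each site update. `positions` maps day index -> position
    (lower is better). Returns None if no post-update days have data."""
    samples = [positions[d]
               for u in update_days
               for d in range(u + 1, u + 1 + window)
               if d in positions]
    return sum(samples) / len(samples) if samples else None
```

If the post-update average is noticeably better than your overall average, that's at least consistent with the "update -> brief ranking bump -> engagement decides whether it sticks" pattern; it proves nothing about causation, of course.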
 
I'd like to think I don't care enough about SEO to go into these discussions - or else I'd have kept my SEO manager job - but ok, yes, that is the way Google will go. They'll strive to have an in-house answer for any query.

Google Flights, Google Hotels, Google Shopping, Google Answers - basically building a closed ecosystem like Facebook, which seems to have been their obsession these recent years. So basically, Google wants to keep people on Google.com and not send traffic on.

However, even in that example you showed, I think Google is bordering on copyright abuse, since indexing content is one thing and passing it off as your own for commercial gain is another. Eventually the Europeans will make Google split into different companies, like Microsoft.

I also think that people will instinctively move away from Google if it gets too commercialized.