Google Update Jagger: What's it all about?
I have to admit I don't pay much attention to the Google updates, PR, or any of that kind of stuff. It's enough for me to add my two articles per week and a bit more fresh content without keeping track of all the algo changes too.
But I have heard a lot about Google's most recent update, Jagger (which apparently has had at least three stages so far). Most people are complaining bitterly about losing an astounding amount of ground in the SERPs, but where are all the people who are winning? I'm one of them. My site has done nothing but go up in the SERPs for the past year at least, and now ranks very well in Google for one of my main keywords. Jagger appears to have moved us up something like 20+ positions. And all this without spam, without any blackhat SEO, or really any SEO at all (beyond a source-ordered layout, halfway-decent page titles, and relevant content).

Other than checking my own results about once a month, I don't use Google for anything other than "real" searches - that is to say, I do most searches for purposes other than monitoring specific keywords. Since Jagger started, I haven't really noticed any extra relevancy, but neither have I noticed any negative changes to the results. In other words, as a normal searcher, the Jagger update has been pretty much transparent to me.

So anyway, those are my rambling and disjointed thoughts on Jagger. ;) I'd be interested in hearing from someone who actually knows what's going on: what's the main thrust of Jagger, what have the implications been for spam, and what types of sites have been treated nicely by this update?
|||||
My stuff is up a bit overall. The "complaining bitterly" folks are those who were cheating anyway....
For an interesting take, see link 1 and link 2.
|||||
Matthew, your experience mirrors mine, and that of a lot of others.
Whining versus reality

Further, when I've read people complaining about their site dropping, you'll notice one thing: they never talk about their past link-building activity. They float around in a complete state of denial about having done things like link exchanges, reciprocal linking, and so on, and will even go to the extreme of saying that their site is completely white hat even though they have engaged in such activities.

The main problem many webmasters have is that they are simply not willing to do what you are doing: slowly, patiently, build up a quality site, with unique, quality content that did not exist on the web in that form until you made it and posted it. Since this is MUCH harder than most people realize, almost everyone opts for shortcuts, and then is outraged when their work gets knocked off. NOTE: there is a very small number of sites that do create valuable, unique content and still get punished unfairly in each update. That's because Google insists on maintaining maximum automation in its systems, and automation will always make mistakes when trying to determine the intent of the site in question.

Almost invariably, webmasters have gotten both greedy and impatient. I've talked to a lot of people who saw drops in Jagger, and almost ALL of them have engaged in some type of link generation scheme. So you have to take this into account when reading all that whining: almost none of the whiners are nearly as white hat as they believe themselves to be. There are a few, here and there, but even then I'd really like to take a look at their real backlink structures with my own eyes before I believe them.

So what is Jagger?

At this point, it's all speculation, but there are some key things to keep in mind:
The Bourbon update was called the 'sloppy webmastering' update by some. We found this to be true: when we dropped there, we fixed the sloppy webmastering errors and the site came back within days. The people who were in denial about their sloppy webmastering are probably still scratching their heads wondering why Google hates them. Lots of egos get in the way in these updates, and they do no good.

Between Bourbon and Jagger there were two smaller updates, one targeting directories, the other harder to pinpoint, but known as the September 22 update. As you can begin to see, there has not in fact been 'a' single update, but rather a rolling out of something. Best guesses have this rollout beginning in November/December 2004, which is when Google's index suddenly jumped from 4 billion to 8 billion pages. Or as we like to say, this is when Google finally merged their two primary indexes, thus setting the stage for the supplemental results problems that still plague Google to this day. More than one knowledgeable SEO has suggested that Google's inability or unwillingness to once and for all dump that old collection of data and replace it with a brand new index may be at the root of some of the major index problems Google seems to be having difficulty resolving. It's my opinion that Bourbon may have been one such attempt to reindex the web with a new indexing engine. Several signs point strongly to this, such as some new URL indexing bugs that appeared fairly recently. However, that isn't all that important; what matters is what we are looking at now, with Jagger.

So what are we looking at with Jagger 3, the resolution?

It's still too early to tell, but this is what it's looking like. First of all, read the patent application; all of this will make much more sense if you do. Google is now using a much more powerful, but of course harder to implement and control, algorithm. I think of it as a control panel with many knobs, although that's probably an oversimplification. This control panel has many options, each main grouping corresponding to the main points laid out in the application, along with possibly some others in module form, such as PageRank and Hilltop, which may still be used to generate what are known as trust rankings for sites.

Rather than relying on fairly simple backlink and on-page factors, Google is now looking for stronger signs of authority for any site. This is not perfect, as a quick check through competitive keyword categories will show, especially on the latest data centers. The main new component is the historical one, along with an increasing emphasis on trust of the site. The historical component makes the detection of link farm activity much easier. Not perfect, mind you, but easier. The less competent or resourceful the site, the less likely it can crack into the top ten currently. It's a bad time for bad/cheap/scumbag SEOs, in other words. But oddly enough, it's become a very good time for high quality SEO.

By looking very carefully at how backlinks appear, and on which sites, Google can attempt to determine whether these links were created for SEO purposes or whether they are real. One sign of a real link is when a site, let's call it known-good-a, links to another site, call it might-be-good-b. This link tells Google that might-be-good-b is looking closer to being known-good-b. The better the pool of known good sites is, the more accurate this will be. And the less likely scum SEOs can get a link from one of those.
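To make that idea concrete, here's a minimal sketch, in Python, of the kind of seed-based trust propagation being described - roughly the TrustRank idea. The graph, the seed pool, and the damping and iteration values are all my own assumptions for illustration; this is a guess at the shape of the thing, not Google's actual algorithm.

:: Code ::
# Minimal sketch of seed-based trust propagation (TrustRank-style).
# Graph, seed pool, damping and round count are illustrative assumptions.

def propagate_trust(outlinks, seeds, damping=0.85, rounds=20):
    """outlinks: dict mapping each site to the sites it links to.
    seeds: the pool of hand-vetted known-good sites."""
    sites = set(outlinks)
    for targets in outlinks.values():
        sites.update(targets)
    trust = {s: (1.0 if s in seeds else 0.0) for s in sites}

    for _ in range(rounds):
        nxt = {s: ((1.0 - damping) if s in seeds else 0.0) for s in sites}
        for source, targets in outlinks.items():
            if targets and trust[source] > 0:
                share = damping * trust[source] / len(targets)
                for target in targets:
                    nxt[target] += share  # trusted sites pass trust forward
        trust = nxt
    return trust

graph = {
    "known-good-a": ["might-be-good-b"],
    "might-be-good-b": ["unknown-c"],
    "unknown-c": [],
}
print(propagate_trust(graph, seeds={"known-good-a"}))

Run on that tiny graph, known-good-a keeps its baseline seed trust, might-be-good-b inherits a share of it, and unknown-c gets a smaller, second-hand share - exactly the "looking closer to known-good" effect described above.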
That link-trust mechanism, by the way, is one major reason you want to dump bad forum links as quickly as humanly possible. You want to let Google know two things: first, that no known bad sites are ever linked to; and second, that if such a link does appear, it also disappears very quickly - it is not a permalink.

What history are they talking about?

They're talking about the history of all the pages they have in their index, and have had. Each page has two primary historical components:

a: the main content, broken down into simple chunks (in other words, Google dumps the little words and just watches the big ones)
b: the outbound links on the page

An outbound link that stays is better than an outbound link that vanishes. One sign of a link farm is dropping outbound links all the time, whenever a link partner removes theirs or goes out of business. It's open to question how much of this Google has implemented currently; my guess is the simplest parts, but probably not the hardest parts. My further guess is that Jagger is the attempt to bring this implementation online. In other words, Jagger is not an update; Jagger is the new Google, more or less. Logically, if you look at just how long it's been - almost four weeks from the very first reports of Jagger 1, before it was even named - this is not an update timeline; it's something much bigger. As others have pointed out to me, this type of flux has been happening since around December 2004, which is in my opinion when the new systems were rolled out, although probably still emulating the old system.
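The outbound-link half of that history is easy to picture in code. Here's a toy sketch that compares dated crawl snapshots of a page's outbound links and flags heavy churn; the snapshot format and the 0.4 threshold are invented for the illustration - the patent describes the signal, not the numbers.

:: Code ::
# Toy illustration of the outbound-link history signal: a page whose
# outbound links keep vanishing between crawls looks like reciprocal-link
# housekeeping. Snapshot format and threshold are my own assumptions.

def average_drop_rate(snapshots):
    """snapshots: list of sets of outbound links, oldest crawl first.
    Returns the average fraction of links removed between crawls."""
    rates = []
    for old, new in zip(snapshots, snapshots[1:]):
        if old:
            rates.append(len(old - new) / len(old))
    return sum(rates) / len(rates) if rates else 0.0

history = [
    {"partner1.com", "partner2.com", "resource.org"},
    {"partner3.com", "partner2.com", "resource.org"},  # partner1 swapped out
    {"partner4.com", "partner5.com", "resource.org"},  # two more swapped out
]
rate = average_drop_rate(history)
if rate > 0.4:
    print(f"drop rate {rate:.0%}: outbound links churn like a link exchange")
else:
    print(f"drop rate {rate:.0%}: outbound links look stable")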
Note, being wrong or right about the timeline doesn't matter all that much; what does matter is having an understanding of what Google is looking for. And as you can tell from your site, what Google is looking for is your site, and sites like it. Sites like that tended to see very little change from Jagger, if any. But sites that had SEO work done on them, over a long time, saw very large changes. It's not cut and dried: some supposedly clean sites dropped, and many heavily SEOed sites stayed high. But those heavily SEOed sites tended to have very expensive backlinks, from high-PR pages, and they had more of them. Which shows that SEO still works; it's just not as easy. And those site owners cannot rest quietly: if they think Google cannot detect them, they are deluding themselves.

Google is trying to learn and automate the detection of as many SEO spam methods as possible. This is what accounts for the unprecedented wave of requests for spam reports from Google, both on Matt Cutts' blog and on the webmaster forums where GoogleGuy makes his presence known. The purpose was and is obvious: Google always wants to automate spam detection, but it's hard, so they need samples of spam to feed their filters the data they need to detect it better. Once the low-end spam is detected, the higher-end spam rises to the top; then reports come in on that, and Google can work at getting better control of its spam detection. Note that Google's aim right now does not appear to be to get rid of all spam, but rather to learn HOW to get rid of all spam. That said, early test results I've run show that Google's antispam measures have already knocked huge numbers of sites out of the top positions. For example, searching for a company name + hard drive + specifications actually returned very good results, for the first time in quite a while. No spam to speak of.

Real searches versus "vegas online poker" etc. searches

You raised another very good point: your real, probably technical, searches are generally very accurate. Why is that, when all the webmasters are screaming? First of all, they are all seeing their lovely spam-built, carefully positioned sites falling like rocks. Or they are seeing their competitors rise. Or a combination. Again, I have not talked to a single person running a commercial site who saw a drop and who had not done some type of SEO-focused link building at some point in the past. For some reason, people simply cannot grasp the simple idea that doing backlink SEO via reciprocal link exchanges, link circles, link directories, etc., is SEO. It doesn't matter what you pretend to yourself. There's nothing wrong with doing this, but you should be aware of what it is you are doing. The degree of self-denial in the webmaster community about these issues rises to almost staggering levels.

Like you, I saw no particular change in Google's results during the last updates. Why? Because I almost never do money-keyword searches unless I'm doing research for a client. So what I see is the same technical search results I pretty much always see. Some movement, but overall it stays pretty good. Why? My theory is very simple: I think the Google engineers use Google to do technical searches, so no matter what the update is, the first people who really exercise it are using it for technical stuff, where there are good solutions and bad solutions, and the search should return as many good solutions as possible. Consider, on the other hand, online poker. There are no good results: all the sites are garbage, all have had lots of SEO link building done, all are members of various link schemes, and all are basically pure SEO products. So who really cares which one ranks above another? Certainly not the alpha Google search testers. And not me either. And even with this in mind, during Jagger, no matter what all the hysterical people said, I thought the money-keyword results were very good, in Jagger 1, Jagger 2, and Jagger 3. Luckily, Jagger 3 relaxed some filters a bit; the results had been getting a bit extreme, but they were very useful.

What do the users think?

Scattered all through the patent application are mentions of tracking user behavior. Most of this tracking happens by connecting user cookies with user searches, and then watching what happens when a user clicks on a SERP result. Do they come back in x seconds and click on the next result, or do they stay there? Staying indicates a successful search. I would assume Google has some type of metric where, if the user stays only a few seconds, it's a total failure - a bad result for that keyword. If they stay a while, then come back, it was a decent result, but not quite what they were looking for. And if they don't come back at all, it was a successful result.
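Since I'm guessing at the metric anyway, here's a rough sketch of that three-way classification in code. The five-second cut-off is an invented number, purely for illustration; the patent talks about the behavior, not the thresholds.

:: Code ::
# Rough sketch of the dwell-time heuristic described above. The three
# outcomes mirror the paragraph; the 5-second cut-off is invented.

def judge_click(dwell_seconds, returned_to_serp):
    if returned_to_serp and dwell_seconds < 5:
        return "failure"   # pogo-stick: bad result for that keyword
    if returned_to_serp:
        return "partial"   # decent result, but not quite what they wanted
    return "success"       # user never came back: search satisfied

print(judge_click(3, returned_to_serp=True))     # failure
print(judge_click(90, returned_to_serp=True))    # partial
print(judge_click(300, returned_to_serp=False))  # success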
In the summer of 2005, Google started doing much more intensive user tracking, even tracking users with cookies disabled, which they usually don't do. This indicated a collection of raw user data, to be compared, I assume, with the Jagger user data they're working on right now. Recent interviews with Google employees demonstrate this understanding:

:: Quote ::
"If consumers see a perceptible quality difference [with rival search engines], they will disappear," admits Mr Arora. (read the full story)

As you can see, Google is fully aware of how delicate this balance is: the SERPs must be what the searchers are searching for. Period. To believe otherwise is fairly naive.

The bottom line is making money: AdWords income

However, don't let that fool you into thinking that these updates don't also accomplish a very important end result: the less value website owners get from doing SEO, the more likely they are to use AdWords to promote their sites. Why, after all, pay thousands of dollars to get your site into the top 10, when Google can get rid of it by detecting the SEO used to place it there, and when you can just pay Google directly to place your site in the AdWords results? As you can see from the story I linked to, this strategy is working very well. Google is posting record profits and income. And it's not because they are manipulating their SERPs to boost AdWords; it's because they are working to get rid of SEOed sites from their top results. This process is amazingly successful. If you'd seen the rise in the top bid prices for money keywords, you'd see what I mean. As more and more major companies decide to forgo SEO and go with a known value, AdWords prices get driven further and further up. But only as long as SERP results and quality stay very high.

While many Google watchers tend to put Google on some fairytale pedestal, Google must make money. If it doesn't make money, and a lot of it, it will lose the race. MSN will crush it. Yahoo will roll over it. If Google cannot create and maintain a very high income stream, they will fail. The market will lose interest, and shares will plummet - well, they will plummet anyway, since they are currently grotesquely overvalued, at levels not seen since before the dot-com bubble burst. The longer Google can keep share prices inflated, the more shares they can sell, and the larger the war chest they can assemble. Google is not a charity; it's a business in a very aggressively competitive industry. Failure to compete adequately will result in their failure. They know this. The only people who do not seem capable of understanding it are a core group of webmasters on various webmaster forums around the web.

Added: vkaryl, LOL, you posted as I was posting; I haven't had time to check those links out yet. But I agree: in probably 99% of cases the complainers were using dodgy methods, and it's almost impossible to get most of them to admit it.
|||||
One interesting thing about the second item vkaryl linked to:
:: Quote ::
Put that together with the fact that they expected two more updates from the start and it almost seems plausible to have manipulated the results to show bad sites mixed with known authority sites in an effort to weed out the growing problem of automatic webpage generating, copyright infringing spam sites that are designed to attract clicks to their adSense ads and in return offer useless content to you.

I'd tend to agree, but not quite the way these guys put it. My guess is that Google simply wanted to show users different 'looks', like in football, to see how they would react - that is, which sets and site types generate the best click success. Clearly AdSense-imitation sites won't do well in that test, but neither will many other types of sites. What Google wanted to see is which site types those are, and which site types people like, and stay on, for those searches.
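If that guess is right, the measurement side is easy to picture: a simple tally of click success per SERP 'look'. Everything in this sketch - the logs, the look labels - is fabricated just to show the shape of the comparison, not anything Google has published.

:: Code ::
# Toy tally of click success per SERP "look", in the spirit of the
# experiment speculated about above. All data here is fabricated.
from collections import defaultdict

click_log = [
    ("look_a", "success"), ("look_a", "failure"), ("look_a", "success"),
    ("look_b", "failure"), ("look_b", "failure"), ("look_b", "success"),
]

tally = defaultdict(lambda: [0, 0])  # look -> [successes, clicks]
for look, outcome in click_log:
    tally[look][1] += 1
    if outcome == "success":
        tally[look][0] += 1

for look, (wins, total) in sorted(tally.items()):
    print(f"{look}: {wins}/{total} successful clicks")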
|||||