Sunday, 25 September 2016

Penguin 4.0 Update

On Friday Google's Gary Illyes announced Penguin 4.0 was now live.

Key points highlighted in their post are:

  • Penguin is a part of their core ranking algorithm
  • Penguin is now real-time, rather than something which periodically refreshes
  • Penguin has shifted from being a sitewide negative ranking factor to a more granular factor

Things not mentioned in the post:

  • if it has been tested extensively over the past month
  • if the algorithm is just now rolling out or if it is already done rolling out
  • if the launch of a new version of Penguin rolled into the core ranking algorithm means old sites hit by the older versions of Penguin have recovered or will recover anytime soon

Since the update was announced, the search results have become more stable.

They may still be fine-tuning the filters a bit...

...but what exists now is likely what sticks for an extended period of time.

Penguin Algorithm Update History

  • Penguin 1: April 24, 2012
  • Penguin 2: May 26, 2012
  • Penguin 3: October 5, 2012
  • Penguin 4: May 22, 2013 (AKA Penguin 2.0)
  • Penguin 5: October 4, 2013 (AKA Penguin 2.1)
  • Penguin 6: rolling update which began on October 17, 2014 (AKA Penguin 3.0)
  • Penguin 7: September 23, 2016 (AKA Penguin 4.0)

Now that Penguin is baked into Google's core ranking algorithms, no more Penguin updates will be announced. Panda updates stopped being announced last year. Instead we now get unnamed "quality" updates.

Volatility Over the Long Holiday Weekend

Earlier in the month many SEOs saw significant volatility in the search results, beginning ahead of Labor Day weekend with a local search update. The algorithm update observations were dismissed as normal fluctuations, in spite of the search results being more volatile than they had been in over 4 years.

There are many reasons for search engineers to want to roll out algorithm updates (or at least test new algorithms) before a long holiday weekend:

  • no media coverage: few journalists are on the job & there is little expectation that the PR team will answer any questions. No official word beyond rumors from self-promotional marketers = no story.
  • many SEOs outside of work: few are watching as the algorithms tip their cards.
  • declining search volumes: long holiday weekends generally have less search volume associated with them, so anyone who is aggressively investing in SEO may wonder if their site was hit, even if it wasn't.
    The communications conflicts this causes between in-house SEOs and their bosses, as well as between SEO companies and their clients, make the job of the SEO more miserable and make the client more likely to pull back on investment, while ensuring the SEO has family issues back home as work ruins their vacation.
  • fresh users: as people travel their search usage changes, thus they have fresh sets of eyes & are doing somewhat different types of searches. This in turn makes their search usage data more dynamic and useful as a feedback mechanism on any changes made to the underlying search relevancy algorithm or search result interface.

Algo Flux Testing Tools

Just about any of the algorithm volatility tools showed a far more significant shift earlier this month than over the past few days.

Take your pick: Mozcast, RankRanger, SERPmetrics, Algaroo, Ayima Pulse, AWR, Accuranker, SERP Watch & the results came out something like this graph from Rank Ranger:

One issue with looking at any of these indexes is that rank shifts tend to be far more dramatic as you move away from the top 3 or 4 search results, so the algorithm volatility scores are much higher than the actual shifts in search traffic. (The least volatile rankings are also the ones with the most usage data & ranking signals associated with them, so the top results for those terms tend to be quite stable outside of verticals like news.)

You can use AWR's flux tracker to see how volatility is higher across the top 20 or top 50 results than it is across the top 10 results.
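To make the depth effect concrete, here is a minimal sketch of one way a flux score could be computed over the top N results (the function and variable names are hypothetical, not taken from any of the tools above). The same two days of rankings score as far more volatile at depth 10 than at depth 3, because the head of the SERP barely moves while the deeper positions reshuffle:

```python
def flux_score(day1, day2, depth):
    """Average absolute position change for the URLs ranked within
    `depth` on day1. A URL that falls out of the window is treated
    as having moved to position depth + 1."""
    new_pos = {url: i + 1 for i, url in enumerate(day2)}
    total_shift = 0
    for i, url in enumerate(day1[:depth]):
        old = i + 1
        new = min(new_pos.get(url, depth + 1), depth + 1)
        total_shift += abs(new - old)
    return total_shift / depth

# Top 3 results unchanged; positions 4 through 10 reshuffled.
day1 = ["a", "b", "c", "d", "e", "f", "g", "h", "i", "j"]
day2 = ["a", "b", "c", "g", "h", "d", "e", "f", "j", "i"]

print(flux_score(day1, day2, 3))   # stable head of the SERP
print(flux_score(day1, day2, 10))  # deeper window looks far more volatile
```

Capping out-of-window moves at depth + 1 keeps a single dropped URL from dominating the score; the real trackers each weight things differently, but the depth effect is the same.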

Example Ranking Shifts

I shut down our membership site in April & now spend most of my time reading books & news to figure out what's next after search, but a couple legacy clients I am winding down work with still have me tracking a few keywords, & one of the terms saw a lot of smaller sites (in terms of brand awareness) repeatedly slide and recover over the past month.

Notice how a number of sites would spike down on the same day & then back up. And then the pattern would repeat.

As a comparison, here is that chart over the past 3 months.

Notice the big ranking moves which became common over the past month were not common the 2 months prior.

Negative SEO Was Real

There is a weird sect of alleged SEOs which believes Google is omniscient, algorithmic false positives are largely a myth, AND negative SEO was never a real thing.

As it turns out, negative SEO was real, which likely played a part in Google taking years to roll out this Penguin update AND changing how they process Penguin from a sitewide negative factor to something more granular.

Update != Penalty Recovery

Part of the reason many people think there was no Penguin update, or responded to the update with "that's it?", is that few sites which were hit in the past recovered, relative to the number of sites which had ranked well until they just got clipped by this algorithm update.

When Google updates algorithms or refreshes data it does not mean sites which were previously penalized will immediately rank again.

Some penalties (absent direct Google investment or nasty public relations blowback for Google) require a set amount of time to pass before recovery is even possible.

Google has no incentive to allow a broad-based set of penalty recoveries on the same day they announce a new "better than ever" spam fighting algorithm.

They'll let some time pass before the penalized sites can recover.

Further, many of the sites which were hit years ago & remain penalized have been so defunded for so long that they've accumulated other penalties due to things like tightening anchor text filters, poor user experience metrics, ad heavy layouts, link rot & neglect.

What to do?

So here are some of the obvious algorithmic holes left by the new Penguin approach...

  • only kidding
  • not sure that would even be a valid mindset in the current market
  • hell, the whole ecosystem is built on quicksand

The trite advice is to make quality content, focus on the user, and build a strong brand.

But you can do all of those well enough that you change the political landscape yet still lose money.

Google & Facebook are in a cold war, competing to see who can kill the open web faster, using each other as justification for their own predation.

Even some of the top brands in big money verticals which were known as the canonical examples of SEO success stories are seeing revenue hits and getting squeezed out of the search ecosystem.

And that is without getting hit by a penalty.

It is getting harder to win in search period.

And it is getting almost impossible to win in search by focusing on search as an isolated channel.

Efforts and investments in chasing the algorithms in isolation are getting less viable by the day.

Anyone operating at scale chasing SEO with automation is likely to step into a trap.

When it happens, that player better have some serious savings or some non-Google revenues, because even with "instant" algorithm updates you can go months or years on reduced revenues waiting for an update.

And if the bulk of your marketing spend while penalized is spent on undoing past marketing spend (rather than building awareness in other channels outside of search) you can almost guarantee that business is dead.

"If you want to stop spam, the most straightforward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is break their spirits." - Matt Cutts


from SEO Book http://ift.tt/2daK50j
via IFTTT http://ift.tt/1ZR0zs6

Thursday, 15 September 2016

Free Google AdWords Keyword Suggestion Tool Alternative

Google recently made it much harder to receive accurate keyword data from the AdWords keyword tool.

They have not only grouped similar terms, but also broadened the data into absurdly wide ranges like 10,000 to 100,000 searches a month. Only active AdWords advertisers receive (somewhat?) decent keyword data. And even with that, there are limitations. Try to view too many terms and you get:

"You’ve reached the maximum number of page views for this day. This page now shows ranges for search volumes. For a more detailed view, check back in 24 hours."

Jennifer Slegg shared a quote from an AdWords advertiser who spoke with a representative:

"I have just spoken to a customer service manger from the Australia support help desk. They have advised me that there must be continuous activity in your google ad-words campaign (clicks and campaigns running) for a minimum of 3-4 months continuous in order to gain focused keyword results. If you are seeing a range 10-100 or 100-1k or 1k -10k its likely your adwords account does not have an active campaign or has not had continuous campaigns or clicks."

So you not only need to be an advertiser, but you need to stay active for a quarter-year to a third of a year to get decent data.

Part of the sales pitch of AdWords/PPC was that you can see performance data right away, whereas SEO investments can take months or years to pay back.

But with Google outright hiding keyword data even from active advertisers, it is probably easier and more productive for those advertisers to start elsewhere.

There are many other keyword data providers (Wordtracker, SEMrush, Wordze, Spyfu, KeywordSpy, Keyword Discovery, Moz, Compete.com, SimilarWeb, Xedant, Ubersuggest, KeywordTool.io, etc.), and there are newer entrants like the Keyword Keg Firefox extension & the brilliantly named KeywordShitter.

In light of Google's push to help make the web more closed-off & further tilt the web away from the interests of searchers toward the interest of big advertisers*, we decided to do the opposite & recently upgraded our keyword tool to add the following features...

  • expanded the results per search to 500
  • we added negative match and modified broad match to the keyword export spreadsheet (along with already having phrase, broad & exact match)
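As a sketch of what those extra export columns contain, the classic AdWords match-type notations can be generated mechanically from each keyword: broad is the bare term, phrase is quoted, exact is bracketed, modified broad prefixes each word with a plus sign, and negative prefixes the term with a minus. The function name here is hypothetical, not part of our tool's code:

```python
def match_type_variants(keyword):
    """Return the classic AdWords match-type notations for one keyword."""
    return {
        "broad": keyword,
        "phrase": f'"{keyword}"',
        "exact": f"[{keyword}]",
        "modified_broad": " ".join("+" + word for word in keyword.split()),
        "negative": f"-{keyword}",
    }

for name, form in match_type_variants("keyword tool").items():
    print(f"{name}: {form}")
```

Running each exported keyword through something like this is all the spreadsheet columns amount to, which is why having them pre-built in the export is a convenience rather than magic.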

Our keyword tool lists estimated search volumes, bid prices, cross links to SERPs, etc. Using it does require account registration, but it is a one-time registration and the tool is free. And we don't collect phone numbers, hard sell over the phone, etc. We even shut down our paid members area, so you are not likely to receive any marketing messages from us anytime soon.

Export is lightning quick AND, more importantly, we have a panda in our logo!

Here is what the web interface looks like:

And here is a screenshot of data in Excel with the various keyword match types:

If the tool looks like it is getting decent usage, we may upgrade it further to refresh the data more frequently, consider adding more languages, add a few more reference links to related niche sites in the footer cross-reference section, and maybe add a few other features.

"Every market has some rules and boundaries that restrict freedom of choice. A market looks free only because we so unconditionally accept its underlying restrictions that we fail to see them." - Ha-Joon Chang


from SEO Book http://ift.tt/2cu8rkh
via IFTTT http://ift.tt/1ZR0zs6

Tuesday, 9 August 2016

How I Learned to Start Loving Social Media's Darkside

I'm baaaaaaack.

Organic Listings

What a fun past couple of years it has been in the digital marketing landscape; we've seen hummingbirds, ads displacing organic listings, phantoms, ads displacing organic listings, RankBrain, and of course ads displacing organic listings. It has been such a long time since my last post that back when I was last writing for SEObook we were still believing the timelines provided by Google employees on when Penguin was going to run next. Remember that? Oh, the memories.

Idiot Proof SEO Concepts You Better Not Screw Up For Me

The reason I'm back is to share a tip. Normally I don't share SEO tips because by sharing information on a tactic, I end up burning the tactic and killing whatever potential usable market value remained on its shelf life. Why share then? Because this isn't something you can kill; it involves people. And killing people is bad. To explain how it works though, I need to explain the two concepts I'm smashing together like chocolate and peanut butter.

Keepin' it REAL.

Chocolate

The chocolate, aka Influencer Marketing – my definition of influencer marketing is having someone tell your story for you. Some people view influencer marketing as paying someone like Kim Kardashian $50,000 to post a picture of herself on Instagram holding a sample of your new line of kosher pickles. While that does fit under my definition as well, I consider that aspirational influencer marketing, since her audience is primarily comprised of people aspiring to be Kim. Also equally valid is having Sally your foodie neighbor post that picture in exchange for a free jar of those delicious pickles; in this particular case the influence would be considered peer level influence, since Sally's audience is going to be comprised largely of people that view Sally as their equal, and possibly recognize that Sally as a foodie knows her food. Personally, I am biased, but I prefer lots of peer influence campaigns over a single big budget aspirational influence campaign, but I digress. If you want to learn a lot more about the differences in campaign types, I spoke with Bronco on the ins and outs of influence.

Peanut Butter

The peanut butter, aka Online Reputation Management, aka ORM – while I would hope reputation management doesn't need to be specifically defined, I'll define it anyhow as changing the online landscape for the benefit of a client's (or your own) reputation. Peanut butter is a really good analogy for ORM because a lot of work gets spread around in a lot of directions, from creating hundreds of thousands of properties designed to flood the SERPs and social channels as a tail that wags the dog, to straight up negative SEO. Yeah, I said it. If negative SEO hadn't been made so much more available by Panda, Penguin, and the philosophical shift toward a negative a priori, ORM would not be the industry that it is today.

So what's the tip? You can combine these two concepts for your clients, and you can do it in a variety of different ways. Let's walk through a few…

POSITIVE/BENIGN Focus

  1. Use aspirational influence to find a blogger/writer to talk about your client or product.
  2. Use peer influence indirectly to let a more difficult to approach blogger/writer “discover” your client and write about him or her.
  3. Use aspirational influence as a means to gain links to some properties. Seriously, this works really well. Some audiences will write a series of articles on whatever certain individuals write about.
  4. Use peer influence to change tone/meaning of a negative article to something more benign.
  5. Use peer influence to find bloggers/writers to discuss concepts that can only be discussed by referencing you or your client.

NEGATIVE Focus

  1. Use peer pressure influence to get material removed.
  2. Use aspirational influence to change the mind of blogger/writer (think politics – this works).
  3. Use peer influence to change links from one target to another in source material (this occurs quite a bit on Wikipedia too).
  4. THE TRUMP® CARD©: Use aspirational influence and peer influence in combination, which I call compulsion marketing, to inspire frightening movements and witch hunts (coordinated DoS attacks, protests, crap link blasts, et al).

My business partner at my influencer marketing network Intellifluence, Terry Godier, and I also refer to some of the above topics under the umbrella of dark influence. I'm sure this list isn't even close to exhaustive, mainly because I don't want to go too deep on how scary one can get. If you need to address such things, I still take on select ORM clients at Digital Heretix and can help you out or refer you to a quality professional that will. Combining concepts and tactics is often a lot more fun than trying to approach a tactic singularly; when possible, work in multiple dimensions.

Think of a way that I missed or some cool concepts that could be paired to be more powerful? Let me know on Twitter.

Cheers,
Joe Sinkwitz


from SEO Book http://ift.tt/2aQxk9X
via IFTTT http://ift.tt/1ZR0zs6

Facebook's Panda Update

So far this year publishers have lost 52% of their Facebook distribution.

Instant Articles may have worked for an instant, but many publishers are likely back where they were before they made the Faustian bargain, except they now have less control over their content distribution and advertising while bearing the higher cost structure of supporting another content format.

When Facebook announced their news feed update to fight clickbait headlines, it sure sounded a lot like the equivalent of Google's Panda update. Glenn Gabe, one of the sharpest guys in the SEO field who regularly publishes insightful content & doesn't blindly shill for the various platform monopolies dominating the online publishing industry, had the same view I did.

Further cementing the "this is Panda" view was an AdAge article quoting some Facebook-reliant publishers. Glad we have already shifted our ways. Nice to see them moving in the same direction we are. etc. ... It felt like reading a Richard Rosenblatt quote in 2011 about Demand Media's strong working relationship with Google or how right after Panda their aggregate traffic level was flat.

January 27, 2011

Peter Kafka: Do you think that Google post was directed at you in any way?

Richard Rosenblatt: It’s not directed at us in any way.

P K: they wrote this post, which talks about content farms, and even though you say they weren’t talking about you, it left a lot of people scratching their heads.

R R: Let’s just say that we know what they’re trying to do. ... He’s talking about duplicate, non-original content. Every single piece of ours is original. ... our relationship is synergistic, and it’s a great partnership.

May 9, 2011

Kara Swisher: What were you trying to communicate in the call, especially since investors seemed very focused on Panda?

R R: What I also wanted to show was that third-party data sources should not be relied on. We did get affected, for sure. But I was not just being optimistic, we wanted to use that to really understand what we can do better.

K S: Given Google’s shift in its algorithm, are you shifting your distribution, such as toward social and mobile?

R R: If you look at where trends are going, that’s where we are going to be.

K S: How are you changing the continued perception that Demand is a content farm?

R R: I don’t think anyone has defined what a content farm is and I am not sure what it means either. We obviously don’t think we are a content farm and I am not sure we can counter every impact if some people think we are.

A couple years later Richard Rosenblatt left the company.

Since the Google Panda update eHow has removed millions of articles from their site. As a company they remain unprofitable a half-decade later & keep seeing YoY media ad revenue declines in the 30% to 40% range.

Over-reliance on any platform allows that platform to kill you. And, in most cases, you are unlikely to be able to restore your former status until & unless you build influence via other traffic channels:

I think in general, media companies have lost sight of building relationships with their end users that will bring them in directly, as opposed to just posting links on social networks and hoping people will click. I think publishers that do that are shooting themselves in the foot. Media companies in general are way too focused on being where our readers are, as opposed to being so necessary to our readers that they will seek us out. - Jessica Lessin, founder of TheInformation

Recovering former status requires extra investment far above and beyond what led to the penalty. And if the core business model still has the same core problems there is no solution.

"I feel pretty confident about the algorithm on Suite 101." - Matt Cutts

Some big news publishers are trying to leverage video equivalents of a Narrative Science or Automated Insights (from Wochit and Wibbitz) to embed thousands of autogenerated autoplay videos in their articles daily.

But is that a real long-term solution to turn the corner? Even if they see a short term pop in ad revenues by using some dumbed-down AI-enhanced low cost content, all that really does is teach people that they are a source of noise while increasing the number of web users who install ad blockers.

And the whole time penalized publishers try to recover the old position of glory, the platform monopolies are boosting their AI skills in the background while they eat the playing field.

The companies which run the primary ad networks can easily get around the ad blockers, but third party publishers can't. As the monopoly platforms broadly defund ad-based publishing, they can put users "in control" while speaking about taking a principle-based approach:

“This isn’t motivated by inventory; it’s not an opportunity for Facebook from that perspective,” Mr. Bosworth said. “We’re doing it more for the principle of the thing. We want to help lead the discussion on this.” ... Mr. Bosworth said Facebook hasn't paid any ad-blocking software company to have its ads pass through their filters and that it doesn’t intend to.

Google recently worked out a deal with Wikimedia to actually cite the source of the content shown in the search results:

it hasn’t always been the easiest to see that the material came from Wikipedia while on mobile devices. At the Wikimedia Foundation, we’ve been working to change that.

While the various platforms ride the edge on what is considered reasonable disclosure, regulatory bodies crack down on individuals participating on those platforms unless they are far more transparent than the platforms are:

Users need to be clear when they're getting paid to promote something, and hashtags like #ad, #sp, #sponsored --common forms of identification-- are not always enough.

The whole "eating the playing field" is a trend which is vastly under-reported, largely because almost everyone engaged in the ecosystem needs to sell they have some growth strategy.

The reality is as the platform gets eaten it only gets harder to build a sustainable business. The mobile search interface is literally nothing but ads in most key categories. More ads. Larger ads. Nothing but ads.

And a bit of scrape after the ads to ensure the second or third screen still shows zero organic results.

And more scraping, across more categories.

What's more, even large scaled companies in big money fields are struggling to monetize mobile users. On the most recent quarterly conference call TripAdvisor executives stated they monetize mobile users at about 30% the rate they monetize desktop or tablet users.

What happens when the big brand advertisers stop believing in the narrative of the value of precise user tracking?

We may soon find out:

P&G two years ago tried targeting ads for its Febreze air freshener at pet owners and households with large families. The brand found that sales stagnated during the effort, but rose when the campaign on Facebook and elsewhere was expanded last March to include anyone over 18.
...
P&G’s push to find broader reach with its advertising is also evident in the company’s recent increases in television spending. Toward the end of last year P&G began moving more money back into television, according to people familiar with the matter.

For mobile to work well you need to be a destination & a habit. But there is tiny screen space and navigational searches are also re-routed through Google hosted content (which will, of course, get monetized).

In fact, what would happen to an advertiser if they partnered with other advertisers to prevent brand bidding? Why that advertiser would get sued by the FTC for limiting user choice:

The bidding agreements harm consumers, according to the complaint, by restraining competition for, and distorting the prices of, advertising in relevant online auctions, by reducing the number of relevant, useful, truthful and non-misleading advertisements, by restraining competition among online sellers of contact lenses, and in some cases, by resulting in consumers paying higher retail prices for contact lenses.

If the above restraint of competition & market distortion is worth suing over, how exactly can Google make the mobile interface AMP exclusive without earning a similar lawsuit?

AMP content presented in both sections will be “de-duplicated” in order to avoid redundancies, Google says. The move is significant in that AMP results will now take up an entire phone screen, based on the example Google shows in its pitch deck.

Are many publishers in a rush to support Google AMP after the bait-n-switch on Facebook Instant Articles?


from SEO Book http://ift.tt/2aXBj3I
via IFTTT http://ift.tt/1ZR0zs6

Sunday, 31 July 2016

Brands Beat Generics

When markets are new they are unproven, thus they often have limited investment targeting them.

That in turn means it can be easy to win in new markets just by virtue of existing.

It wouldn't be hard to rank well creating a blog today about the evolution of the 3D printing industry, or a how to site focused on Arduino or Raspberry Pi devices.

Couple a bit of passion with significant effort & limited competition and winning is quite easy.

Likewise in a small niche geographic market one can easily win with a generic, because the location acts as a market filter which limits competition.

But as markets age and become more proven, capital rushes in, which pushes out most of the generic unbranded players.

Back in 2011 I wrote about how Google had effectively killed the concept of category killer domains through the combination of ad displacement, vertical search & the algorithmic ranking shift moving away from relevancy toward awareness. 2 months before I wrote that post Walgreen Co. acquired Drugstore.com for about $429 million. At the time Drugstore.com was one of the top 10 biggest ecommerce pure plays.

Thursday Walgreens Boots announced it would shut down Drugstore.com & Beauty.com:

The company is still trying to fine tune its e-commerce strategy but clearly wants to focus more of its resources on one main site. “They want to make sure they can invest more of the equity in Walgreens.com,” said Brian Owens, a director at the consultancy Kantar Retail. “Drugstore.com and Beauty.com are distractions.”

Big brands can sometimes get coverage of "meh" content by virtue of being associated with a big brand, but when they buy out pure-play secondary e-commerce sites those often fail to gain traction and get shuttered:

Other retailers have picked up pure-play e-commerce sites, only to shut them down shortly thereafter. Target Corp. last year shuttered ChefsCatalog.com and Cooking.com, less than three years after buying them.

The lack of publishing savvy among most large retailers means there will be a water cycle of opportunity which keeps re-appearing; however, as the web gets more saturated, many of these opportunities are going to become increasingly niche options riding new market trends.

If you invest in zero-sum markets there needs to be some point of differentiation to drive switching. There might be opportunity for a cooking.com or a drugstore.com targeting emerging and frontier markets where brands are under-represented online (much like launching Drugstore.com in the US back in 1999), but it is unlikely pure-play ecommerce sites will be able to win in established markets if they use generically descriptive domains which make building brand awareness and perceived differentiation next to impossible.

Target not only shut down cooking.com, but they didn't even bother redirecting the domain name to an associated part of their website.

It is now listed for sale.

Many short & generic domain names are guaranteed to remain in a purgatory status.

  • The price point is typically far too high for a passionate hobbyist to buy them & attempt to turn them into something differentiated.
  • The names are too generic for a bigger company to do much with them as a secondary option
    • the search relevancy & social discovery algorithms are moving away from generic toward brand
    • retailers have to save their best ideas for their main branded site
    • the rise of cross-device tracking + ad retargeting further incentivizes them to focus exclusively on a single bigger site

from SEO Book http://ift.tt/2aH5IGb
via IFTTT http://ift.tt/1ZR0zs6

Thursday, 12 May 2016

Reinventing SEO

Back in the Day...

If you are new to SEO it is hard to appreciate how easy SEO was say 6 to 8 years ago.

Almost everything worked quickly, cheaply, and predictably.

Go back a few years earlier and you could rank a site without even looking at it. :D

Links, links, links.

Meritocracy to Something Different

Back then sharing SEO information acted like a meritocracy. If you had something fantastic to share & it worked great you were rewarded. Sure you gave away some of your competitive advantage by sharing it publicly, but you would get links and mentions and recommendations.

These days most of the best minds in SEO don't blog often. And some of the authors who frequently publish literally everywhere are a series of ghostwriters.

Further, most of the sharing has shifted to channels like Twitter, where the half-life of the share is maybe a couple hours.

Yet if you share something which causes search engineers to change their relevancy algorithms in response the half-life of that algorithm shift can last years or maybe even decades.

Investing Big

These days breaking in can be much harder. I see some 3- or 4-month-old sites with over 1,000 high quality links, which have clearly invested deep into 6 figures, yet appear to be getting about 80 organic search visitors a month.

From a short enough timeframe it appears nothing works, even if you are using a system which has worked, should work, and is currently working on other existing & trusted projects.

Time delays have an amazing impact on our perceptions and how our reward circuitry is wired.

Most of the types of people who have the confidence and knowledge to invest deep into 6 figures on a brand new project aren't creating "how to" SEO information and giving it away for free. Doing so would only harm their earnings and lower their competitive advantage.

Derivatives, Amplifications & Omissions

Most of the info created about SEO today is derivative (written by people who write about SEO but don't practice it), or comes from people overstating the risks and claiming x and y and z don't work, can't work, and will never work.

And then from there you get the derivative amplifications of don't, can't, won't.

And then there are people who read an old blog post about how things were x years ago and write as though everything is still the same.

Measuring the Risks

If you are using lagging knowledge from derivative "experts" to drive strategy you are most likely going to lose money.

  • First, if you are investing in conventional wisdom then there is little competitive advantage to that investment.
  • Secondly, as techniques become more widespread and widely advocated Google is more likely to step in and punish those who use those strategies.
  • It is when a strategy is most widely used and seems safest that the risk is at its peak while the rewards are de minimis.

With all the misinformation, how do you find out what works?

Testing

You can pay for good advice. But most people don't want to do that, they'd rather lose. ;)

The other option is to do your own testing. Then when you find a place where conventional wisdom is wrong, invest aggressively.

"To invent you have to experiment, and if you know in advance that it’s going to work, it’s not an experiment. Most large organizations embrace the idea of invention, but are not willing to suffer the string of failed experiments necessary to get there. Outsized returns often come from betting against conventional wisdom, and conventional wisdom is usually right." - Jeff Bezos

That doesn't mean you should try to go against consensus view everywhere, but wherever you are investing the most it makes sense to invest in something that is either hard for others to do or something others wouldn't consider doing. That is how you stand out & differentiate.

But to do your own testing you need to have a number of sites. If you have one site that means everything to you and you get wildly experimental then the first time one of those tests goes astray you're hosed.

False Positives

And, even if you do nothing wrong, if you don't build up a stash of savings you can still get screwed by a false positive. Even having a connection in Google may not be enough to overcome a false positive.

Cutts said, “Oh yeah, I think you’re ensnared in this update. I see a couple weird things. But sit tight, and in a month or two we’ll re-index you and everything will be fine.” Then like an idiot, I made some changes but just waited and waited. I didn’t want to bother him because he’s kind of a famous person to me and I didn’t want to waste his time. At the time Google paid someone to answer his email. Crazy, right? He just got thousands and thousands of messages a day.

I kept waiting. For a year and a half, I waited. The revenues kept trickling down. It was this long terrible process, losing half overnight but then also roughly 3% a month for a year and a half after. It got to the point where we couldn’t pay our bills. That’s when I reached out again to Matt Cutts, “Things never got better.” He was like, “What, really? I’m sorry.” He looked into it and was like, “Oh yeah, it never reversed. It should have. You were accidentally put in the bad pile.”
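The arithmetic in that story compounds worse than it sounds: an overnight 50% drop followed by roughly 3% monthly decay for a year and a half leaves under a third of the original revenue. A quick sketch with illustrative, normalized numbers (not the site's actual figures):

```python
# "Losing half overnight but then also roughly 3% a month for a year and a half"
initial = 1.0                      # revenue before the penalty (normalized)
after_drop = initial * 0.5         # overnight 50% loss
months = 18                        # a year and a half of compounding decay
remaining = after_drop * (0.97 ** months)
print(f"{remaining:.2f}")          # ~0.29, i.e. about 29% of original revenue
```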

“How did you go bankrupt?"
Two ways. Gradually, then suddenly.”
― Ernest Hemingway, The Sun Also Rises

True Positives

A lot of SEMrush charts look like the following

What happened there?

Well, obviously that site stopped ranking.

But why?

You can't be certain why without doing some investigation. And even then you can never be 100% certain, because you are dealing with a black box.

That said, there are constant shifts in the algorithms across regions and across time.

Paraphrasing quite a bit here, but in this video Search Quality Senior Strategist at Google Andrey Lipattsev suggested...

He also explained the hole Google has in their Arabic index, where spam is much more effective because there is little useful content to index and rank, and because Google modeled their ranking algorithms largely on publishing strategies in the western world. Fixing many of these holes is less of a priority because they view evolving with mobile friendliness, AMP, etc. as more important. They algorithmically ignore many localized issues & try to clean up some aspects of that manually. But even whoever is winning with the spam stuff at the moment might lose not only to an algorithm update or manual clean up: once Google has something great to rank there, it will eventually win, displacing some of the older spam on a near permanent basis. The new entrant raises the barrier to entry for the lower-quality stuff that was winning via sketchy means.

Over time the relevancy algorithms shift. As new ingredients get added to the algorithms & old ingredients get used in new ways it doesn't mean that a site which once ranked

  • deserved to rank
  • will keep on ranking

In fact, sites which don't get a constant stream of effort & investment are more likely to slide than have their rankings sustained.

The above SEMrush chart is for a site which uses the following as its header graphic:

When there is literally no competition and the algorithms are weak, something like that can rank.

But if Google looks at how well people respond to what is in the result set, a site as ugly as that is going nowhere fast.

Further, a site like that would struggle to get any quality inbound links or shares.

If nobody reads it then nobody will share it.

The content on the page could be Pulitzer prize level writing and few would take it seriously.

With that design, death is certain in many markets.

Many Ways to Become Outmoded

The above ugly header design with no taste and a really dumb condescending image is one way to lose. But there are also many other ways.

Excessive keyword repetition, like a footer with the same phrase repeated 100 times.

Excessive focus on monetization to where most visitors quickly bounce back to the search results to click on a different listing.

Ignoring the growing impact of mobile.

Blowing out the content footprint with pagination and tons of lower quality backfill content.

Stale content full of outdated information and broken links.

A lack of investment in new content creation AND promotion.

Aggressive link anchor text combined with low quality links.

Investing in Other Channels

The harder & more expensive Google makes it to enter the search channel the greater incentive there is to spend elsewhere.

Why is Facebook doing so well? In part because Google did the search equivalent of what Yahoo! did with their web portal. The rich diversity in the tail was sacrificed to send users down well-worn paths. If Google doesn't want to rank smaller sites, and their associated algorithmic biases mean Facebook and Amazon.com rank better, then perhaps it makes more sense to play on those platforms & get Google traffic as a free throw-in.

Of course aggregate stats are useless and what really matters is what works for your business. Some may find Snapchat, Instagram, Pinterest or even long forgotten StumbleUpon as solid traffic drivers. Other sites might do well with an email newsletter and exposure on Twitter.

Each bit of exposure (anywhere) leads to further awareness. Which can in turn bleed into aggregate search performance.

People can't explicitly look for you in a differentiated way unless they are already aware you exist.

Some amount of remarketing can make sense because it helps elevate the perceived status of the site, so long as it is not overdone. However, if you are selling a product the customer already bought, or you are marketing to marketers, there is a good chance such investments will be money wasted while you alienate past customers.

Years ago people complained about an SEO site being far too aggressive with ad retargeting. And while surfing today I saw that same site running retargeting ads so frequently that you can't scroll far enough down a page to have their ad disappear before seeing it once again.

If you don't have awareness in channels other than search it is easy to get hit by an algorithm update if you rank in competitive markets, particularly if you managed to do so via some means which is the equivalent of, erm, stuffing the ballot box.

And if you get hit and then immediately run off to do disavows and link removals, and then only market your business in ways that are passively driven & tied to SEO, you'll likely stay penalized for a long, long time.

While waiting for an update, you may find you are Waiting for Godot.

from SEO Book http://ift.tt/1XmPEIJ
via IFTTT http://ift.tt/1ZR0zs6

Wednesday, 11 May 2016

Google Rethinking Payday Loans & Doorway Pages?

Nov 12, 2013 WSJ: Google Ventures Backs LendUp to Rethink Payday Loans

Google Ventures Partner Blake Byers joined LendUp’s board of directors with his firm’s investment. The investor said he expects LendUp to make short-term lending reasonable and favorable for the “80 million people banks won’t give credit cards to,” and help reshape what had been “a pretty terrible industry.”

What sort of strategy is helping to drive that industry transformation?

How about doorway pages.

That is in spite of Google going out of their way just last year to say they were going to kill those sorts of strategies.

March 16, 2015 Google To Launch New Doorway Page Penalty Algorithm

Google does not want to rank doorway pages in their search results. The purpose behind many of these doorway pages is to maximize their search footprint by creating pages both externally on the web or internally on their existing web site, with the goal of ranking multiple pages in the search results, all leading to the same destination.

These sorts of doorway pages are still live to this day.

Simply look at the footer area of http://ift.tt/1rXbGWd

But the pages existing doesn't mean they rank.

For that let's head over to SEMrush and search for LendUp.com

Hot damn, they have almost 10,000 "payday" related keywords they rank for.

And you know their search traffic is only going to increase now that competitors are getting scrubbed from the marketplace.

Today we get journalists acting as conduits for Google's public relations efforts, writing headlines like: Google: Payday Loans Are Too Harmful to Advertise.

Today those sorts of stories are literally everywhere.

Tomorrow the story will be over.

And when it is.

Precisely zero journalists will have covered the above contrasting behaviors.

As they weren't in the press release.

Best yet, not only does Google maintain their investment in payday loans via LendUp, but there is also a bubble in the personal loans space, so Google will be able to show effectively the same ads for effectively the same service. And by the time the P2P loan bubble pops, some of the payday lenders will have followed LendUp's lead in re-branding their offers as being something else in name.

Meanwhile, they're off to revolutionize the next industry by claiming everyone else is greedy and scummy and that there is a wholesome way to do the same thing leveraging new technology, when in reality the primary difference between the business models is simply a thin veneer of tech-utopian PR misinformation.

Don't expect to see a link to this blog post on TechCrunch.

There you'll read some hard-hitting cutting edge tech news like:

Banks are so greedy that LendUp can undercut them, help people avoid debt, and still make a profit on its payday loans and credit card.

#MomentOfZeroTruth #ZMOT

from SEO Book http://ift.tt/1T6Fhnd
via IFTTT http://ift.tt/1ZR0zs6