Google: URL Length Is Not A Ranking Factor

by Barry Schwartz and first posted on Search Engine Roundtable

Google’s John Mueller said on Twitter that “URL length is not a ranking factor.” He was responding to someone whose CMS autogenerates URLs that are often longer than what is recommended for SEO. Now, what is recommended for SEO in terms of length?

Maybe for usability, keeping your URLs short and sweet is a good thing. But Google can crawl and process pretty long URLs.

Here is John’s tweet:

Per-Erik Skramstad@perskram

@JohnMu A lot of our automatically generated URLs are longer than most SEOs recommend. It only applies to not very important pages, but I wonder how bad this is in Google’s eyes all the same.

🍌 John 🍌

@JohnMu

URL length is not a ranking factor.


Google can handle URLs as long as 2,000 characters or so. And while Google prefers to show the shorter version of a URL when available, that doesn’t mean Google cannot index and rank longer URLs. Meaning, if your URL is .com/google-url-length-seo.html?tons-of-parameters-goes-here, Google will likely canonicalize it to .com/google-url-length-seo.html and drop the added-on parameters.
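That canonicalization idea – keeping the path and dropping the query string – can be sketched in a few lines of Python. This is only an illustration of the concept, not Google’s actual logic, and the example URL is hypothetical:

```python
from urllib.parse import urlsplit, urlunsplit

def strip_parameters(url):
    """Drop the query string and fragment, keeping scheme, host,
    and path -- a rough sketch of the 'shorter version' of a URL."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

long_url = "https://example.com/google-url-length-seo.html?tons-of-parameters-goes-here"
print(strip_parameters(long_url))
# https://example.com/google-url-length-seo.html
```

In practice you would declare your preferred version with a rel=canonical link rather than rely on a search engine to pick one for you.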

New: Google Search Console Removals Tool

by Barry Schwartz and originally posted on Search Engine Roundtable.

Google has just launched a new tool within the new Google Search Console, named the removals tool. This tool lets you temporarily block pages on your site from search results, shows removal requests reported via Google’s other public removal tools (i.e., the Remove Outdated Content tool), and lets you manage SafeSearch filtering.

You should be able to access the tool in Search Console – if it doesn’t load for you, then it is still rolling out; give it some time. At 8:28am ET it started to work for me, so hopefully you will see it soon.

Temporary removals

Here is where you can request temporary removals of specific URLs on sites you have verified. Google said there are two types of removal requests available. (1) Temporary remove URL will hide the URL from Google Search results for about six months and clear the cached copy of the page. (2) Clear cache URL clears the cached page and wipes out the page description snippet in Search results until the page is crawled again.


Outdated content

The Outdated content section provides information on removal requests made through the public Remove Outdated Content tool, which can be used by anyone (not just site owners) to update search results showing information that is no longer present on a page.

SafeSearch filtering

Often SafeSearch is hard for SEOs and Webmasters to debug. Is my content being blocked because it was tagged as being adult? This tool should help you see that quickly. Google said “The SafeSearch filtering section in Search Console shows a history of pages on your site that were reported by Google users as adult content using the SafeSearch Suggestion tool. URLs submitted using this tool are reviewed, and if Google feels that this content should be filtered from SafeSearch results, these URLs are tagged as adult content.”


Google: Do Not Use Robots.txt To Block Indexing Of URLs With Parameters

Written by Barry Schwartz and first appearing on Search Engine Roundtable

Google’s John Mueller said you should absolutely not “use robots.txt to block indexing of URLs with parameters.” He said if you do that, Google “cannot canonicalize the URLs, and you lose all of the value from links to those pages.” Instead, use rel=canonical and link consistently throughout your site.
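The failure mode John describes is easy to demonstrate with Python’s standard-library robots.txt parser: once a parameterized URL is disallowed, a compliant crawler never fetches the page, so any rel=canonical tag on it goes unseen. The rules and URLs below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks a section whose URLs carry
# filter parameters.
rules = """
User-agent: *
Disallow: /products
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# The filtered URL is blocked, so a crawler obeying robots.txt never
# downloads the page -- and never sees its <link rel="canonical">.
blocked = not parser.can_fetch("*", "https://example.com/products?color=red&size=m")
print(blocked)  # True
```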

John said this on Twitter, here is an embed of the tweet:

RomainP@RomainP29619045

Hello @JohnMu , I see more and more website having pages “indexed despite being blocked by robots.txt”. Any Idea on why or how to stop that? Mainly URL with parameters.

🍌 John 🍌

@JohnMu

Don’t use robots.txt to block indexing of URLs with parameters. If you do that, we can’t canonicalize the URLs, and you lose all of the value from links to those pages. Use rel-canonical, link cleanly internally, etc.


He then posted a follow-up about why it is so bad to block these URLs with robots.txt:

RomainP@RomainP29619045

Thank you for your answer and time. The thing is, on e-commerce websites, filter mean a lots of parameters, so I use both canonical and robots.txt to try not to waste time of bots on tons of pages. Wrong practice?

🍌 John 🍌

@JohnMu

We wouldn’t see the rel-canonical if it’s blocked by robots.txt, so I’d pick either one or the other. If you do use robots.txt, we’ll treat them like other robotted pages (and we won’t know what’s on the page, so we might index the URL without content).


So be careful about this and double-check all of these things on your websites.

New Google Search Console Speed Reports Rolling Out

First published on Search Engine Roundtable.


Google is now rolling out the new, experimental “Speed” reports within Google Search Console. Google began testing this last May, and we have seen more and more webmasters get the report since then; now Google has announced it is rolling out as “experimental.”

Google said on Twitter “We’re excited to begin the public rollout for the Search Console Speed report.”

Google Webmasters

@googlewmc

We’re excited to begin the public rollout for the Search Console Speed report 💨! Let’s make the web faster together 💪 https://webmasters.googleblog.com/2019/11/search-console-speed-report.html 



I posted a more detailed look at these reports a couple weeks ago at Search Engine Land.

Here is what Google said:

The report classifies URLs by speed and the issue that causes any slowdowns. Drill down on a specific issue to see examples of slow URLs to help you prioritize performance improvements for them. To get a better sense of what type of optimization can be performed for a specific URL, the report links to the PageSpeed Insights tool, which provides information on specific optimization opportunities. You should use this report both for monitoring your performance over time and for tracking fixes you made to your website. If you fix an issue, use the report to track whether users experienced a performance improvement when browsing the fixed version of your website.

To help you understand how your site is performing, you can also see what types of URLs are doing better by checking the moderate and fast buckets.
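The bucketing described above can be sketched by classifying URLs on a single metric such as First Contentful Paint. The thresholds here are illustrative (roughly the under-one-second “fast” and over-three-seconds “slow” cut-offs Google has described); this is not the report’s actual logic:

```python
def classify_fcp(fcp_ms):
    """Bucket a URL by First Contentful Paint in milliseconds,
    using illustrative cut-offs similar to the Speed report's."""
    if fcp_ms < 1000:
        return "fast"
    if fcp_ms <= 3000:
        return "moderate"
    return "slow"

# Hypothetical field data for three pages.
pages = {"/": 800, "/blog": 2400, "/store": 4100}
buckets = {url: classify_fcp(ms) for url, ms in pages.items()}
print(buckets)
# {'/': 'fast', '/blog': 'moderate', '/store': 'slow'}
```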

Forum discussion at Twitter.

Google BERT Update Impacts 10% Queries & Has Been Rolling Out All Week

Originally appearing on Search Engine Roundtable.


This week Google announced that, starting earlier this week and continuing throughout the week, it has been pushing out what it calls the “biggest leap forward in the past five years, and one of the biggest leaps forward in the history of search.” That is the BERT algorithm update, which is Google’s way of better understanding one out of ten queries the way humans understand them.

I covered it in detail at Search Engine Land but let me sum it up in bullet point fashion for those quick readers:

(1) BERT began rolling out earlier this week and will be fully live by the end of this week. It is live for US English language queries.

(2) It also impacts featured snippets in a big way, and not just for English language featured snippets like with core search but many different languages.

(3) That means the fluctuations we saw this weekend and midweek were probably related to this.

(4) BERT is a lot like RankBrain, in that it is a machine learning algorithm that aims to better understand queries and content on a page.

(5) Technically, BERT is a neural network-based technique for natural language processing (NLP) pre-training. BERT stands for Bidirectional Encoder Representations from Transformers. Google wrote about it in more detail last year.

(6) BERT allows Google to better understand queries that are more human-like – queries and content that are more natural-language and conversational. BERT helps Google understand the nuances and context of words in searches and better match those queries with more relevant results.

(7) Google says this is massive – massive in that it impacts 10% of all queries – and Google also said this is its biggest step forward for search in the past five years, and one of the biggest steps forward in the history of search altogether.

(8) We definitely noticed changes this weekend and midweek, but did it feel as big as Panda or Penguin (I know they are totally different beasts)? No, it did not. Why? I assume because SEOs can’t optimize for it 🙂 and thus it feels smaller to SEOs.

(9) Google said they tested BERT in a big way, and the company is seeing significant benefits to using it.

(10) It does not replace RankBrain or other language algorithms; it can be used in conjunction with them.

(11) I assume you cannot optimize for BERT, just as you cannot optimize for RankBrain – just write for humans.

(12) Neural matching is different also, see this story.
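One way to see why this matters: older bag-of-words style matching ignores word order, while a model like BERT reads each word in the context of the words around it. The toy comparison below is not how Google’s systems work – it simply shows two queries with opposite meanings that an order-free representation cannot tell apart (Google’s own announcement used a “brazil traveler to usa” example):

```python
def bag_of_words(query):
    """Order-free representation of a query: just the set of terms."""
    return frozenset(query.lower().split())

q1 = "brazil traveler to usa"
q2 = "usa traveler to brazil"

# Opposite meanings, identical bags of words -- a context-aware model
# like BERT is needed to tell the direction of travel apart.
print(bag_of_words(q1) == bag_of_words(q2))  # True
```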

The original post includes several before-and-after screenshots of BERT in action, plus a featured snippet example; note that those results may not still be live, because the screenshots were taken earlier and search results change rapidly.

BERT Doesn’t Feel Huge

The chatter and tracking tools have shown changes this weekend and midweek, but it is not as significant as a core update or other updates – at least from an SEO’s perspective, at least not yet. This is based on the signals I track…

Bill Lambert Right Again?

The Bill Lambert character seems to have been spot on in calling it a game changer, and was spot on about when this would roll out. Even when I said the chatter wasn’t huge, he stuck to his guns and said the update was going on now. Just interesting, don’t you think?

Early SEO Reaction


So far we have very little chatter in the community, and Googlers have not yet responded to BERT-related questions on Twitter. I did, however, ask Google numerous questions via email, and Google replied to all of them, for this piece and my Search Engine Land article.

Google’s SearchLiaison account tweeted about it at 10am today.

Google’s Paid Search is a ‘shakedown’ says Jason Fried, Basecamp founder and CEO


This article originally appeared on CNBC and was written by Cindy Ord for CNBC.

Do a Google search for Basecamp, a web-based project management tool company, and you might see one or more ads for competitors show up in results above the actual company.

Basecamp CEO and co-founder Jason Fried sounded off against the practice Tuesday, calling it a “shakedown” and saying it’s like ransom to have to pay up just to be seen in results.

“When Google puts 4 paid ads ahead of the first organic result for your own brand name, you’re forced to pay up if you want to be found,” he tweeted Tuesday afternoon. “It’s a shakedown. It’s ransom. But at least we can have fun with it. Search for Basecamp and you may see this attached ad.”

Jason Fried

@jasonfried

When Google puts 4 paid ads ahead of the first organic result for your own brand name, you’re forced to pay up if you want to be found. It’s a shakedown. It’s ransom. But at least we can have fun with it. Search for Basecamp and you may see this attached ad.


The tweet includes the screenshot of an ad for Basecamp, reading “Basecamp.com | We don’t want to run this ad.” The copy says “We’re the #1 result, but this site lets companies advertise against us using our brand. So here we are. A small, independent co. forced to pay ransom to a giant tech company.”

Fried’s complaint comes as regulators increasingly scrutinize Google’s dominance in certain areas, including search and advertising. The Justice Department is reportedly looking into Google’s digital advertising and search operations as authorities prepare an antitrust review of tech giants’ market power, and more than 30 states are considering their own antitrust investigations. The company could stand to face billions of dollars in fines, as it has from competition authorities in Europe, or even be forced to spin off business units like YouTube.

In an interview with CNBC, Fried said the company hadn’t previously advertised on Google but started doing so since Basecamp would sometimes show up fifth in search results under advertisements, “even though we’re the first organic result and it’s our brand.”

For instance, when searching for Basecamp on Google, a user might see an ad for Monday.com positioning itself as a Basecamp alternative. Monday.com didn’t immediately respond to a request for comment.

This practice, called “conquesting,” is a common way for brands to show up when potential customers search for a competitor, and is common on many different platforms other than Google — for instance, if you search for one brand on Amazon, you might see a slew of products from other brands before you find what you were searching for.

Fried also said it can be tough for the average consumer to discern whether a listing is an ad since Google’s “Ad” qualifier is so small.

“It’s so easy to miss,” he said. “The ads look more and more like organic results.”

“It just seems completely unfair,” Fried said. “You basically have to pay protection money to Google to even have a chance.” He said the company is running the ad to stand up for small businesses having these kinds of problems with Google.

Fried also said the company has filed complaints about trademark violations with Google for ads that use Basecamp’s name, but said ads of that kind keep popping up.

In a statement, a Google spokeswoman said the company prohibits the use of trademarked terms in the text of an ad if the owner files a complaint. “Our trademark policy balances the interests of users, advertisers and trademark owners, ” the statement said. “To provide users with the most relevant ads, we don’t restrict trademarked terms as keywords. We do, however, restrict trademarked terms in ad text if the trademark owner files a complaint.”

The CEO of Shopify, Tobias Lutke, shared and weighed in on Fried’s tweet.

“It’s totally crazy for google to get away with charging what’s basically protection money on your own brand name,” he wrote. ”‘Nice high intend traffic you got there, would be a shame if something were to happen to it.’”

Tobi Lütke

@tobi

It’s totally crazy for google to get away with charging what’s basically protection money on your own brand name. “Nice high intend traffic you got there, would be a shame if something were to happen to it” https://twitter.com/jasonfried/status/1168986962704982016 


Other companies in recent weeks have discussed issues with Google ads. IAC said in early August it had seen an unanticipated increase in the cost of customers from Google, its largest source of traffic. The company said “users arriving through paid search results were up substantially, but considerably more expensive.”

Booking Holdings, the parent of Booking.com, Priceline and Kayak, has also long counted on Google for traffic, spending billions of dollars a year on ads and search engine optimization. But last month the company said it has “observed a long-term trend of decreasing performance marketing returns on investment (‘ROIs’), a trend we expect to continue” and had shifted some of its marketing spending from search to other means of advertising.

They Call it Domain Diversity

by Barry Schwartz and first posted on Search Engine Roundtable.

Google Change Restricts Two Listings From Same Domain For Most Search Results

Google announced last night, as I covered in detail at Search Engine Land, that they released another update earlier this week that finished rolling out yesterday. This update restricts how many times a search results listing can show in the Google search results for most queries.

In short – Google said that for most queries, it will begin showing only up to two listings per domain in the top search results. This does not mean Google will show only two listings from a domain on every search results page; rather, it applies to queries where Google thinks it makes sense to show more diversity from different domains. Branded queries may show more than two, and you get the point.
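The roll-up described here behaves like a post-ranking filter. Below is a minimal sketch, assuming the cap is applied per host over an already-ranked list – Google has not published its implementation, and the URLs are made up:

```python
from urllib.parse import urlsplit

def diversify(ranked_urls, max_per_site=2):
    """Keep the ranked order, but allow at most `max_per_site`
    results from any one host -- a rough sketch of the idea."""
    counts = {}
    kept = []
    for url in ranked_urls:
        host = urlsplit(url).netloc
        if counts.get(host, 0) < max_per_site:
            kept.append(url)
            counts[host] = counts.get(host, 0) + 1
    return kept

results = [
    "https://a.com/1", "https://a.com/2", "https://a.com/3",
    "https://b.com/1", "https://c.com/1",
]
print(diversify(results))
# ['https://a.com/1', 'https://a.com/2', 'https://b.com/1', 'https://c.com/1']
```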


Here is where Google said they have launched this update:

Google SearchLiaison

@searchliaison

Have you ever done a search and gotten many listings all from the same site in the top results? We’ve heard your feedback about this and wanting more variety. A new change now launching in Google Search is designed to provide more site diversity in our results….


Two Results Max For Most Queries

Google said they generally won’t show more than two results from the same domain in the top Google search results. Google may still show more than two results if they determine a searcher would benefit from it and it is relevant for the query.

Google SearchLiaison

@searchliaison

This site diversity change means that you usually won’t see more than two listings from the same site in our top results. However, we may still show more than two in cases where our systems determine it’s especially relevant to do so for a particular search….


Subdomains & Root Domains Are The Same

For the most part, Google will treat subdomains as part of the main root domain. So blog.domain.com would count towards domain.com being listed in the top search results. Of course, there are cases where Google won’t consider them part of the same domain – for instance, when hosting platforms use subdomains to host unique sites, Google will consider those separate.
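Grouping subdomains under their root domain amounts to reducing each hostname to a registrable domain. The sketch below naively takes the last two labels; real code would need the Public Suffix List to handle suffixes like .co.uk, and the platform-subdomain exception Google mentions would have to be special-cased:

```python
def site_key(host):
    """Collapse a hostname to a naive 'root domain' key (last two
    labels). Real code should use the Public Suffix List -- this is
    only an illustration of the roll-up."""
    return ".".join(host.split(".")[-2:])

print(site_key("blog.domain.com"))       # domain.com
print(site_key("domain.com"))            # domain.com
print(site_key("shop.blog.domain.com"))  # domain.com
```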

Google SearchLiaison

@searchliaison

Site diversity will generally treat subdomains as part of a root domain. IE: listings from subdomains and the root domain will all be considered from the same single site. However, subdomains are treated as separate sites for diversity purposes when deemed relevant to do so….


Danny Sullivan

@dannysullivan

Say previously you had 3 listings from root.domain, 3 from a.root.domain and 3 from b.root.domain. We’d now generally treat those as all the same site and show up to two in total from them all, rather than up to two per domain or subdomain (and thus six total). But….

Danny Sullivan

@dannysullivan

In some cases, we may decide to treat the subdomains as their own sites, if it seems really relevant to. And none of this involves ranking. We don’t mix the domains together or anything. It’s just about the roll-up when we decide for display.


John-Henry Scherck@JHTScherck

I took it as a hint towards platforms that issue subdomains to users, like http://WordPress.com  subdomains from random users. Those shouldn’t impact rankings for http://WP.com , they are totally different sites and should be treated as such.

Danny Sullivan

@dannysullivan

Yes.


Different From June 2019 Core Update

Google said this is different and unrelated to the June 2019 core update that launched on June 3rd and was big for some sites.

Google SearchLiaison

@searchliaison

Finally, the site diversity launch is separate from the June 2019 Core Update that began this week. These are two different, unconnected releases.


Diversity Update Started On June 3rd Also

Danny Sullivan said this update started rolling out two days ago and finished yesterday, so technically it overlapped with the June 2019 core update rollout. We don’t know if the June core update is done rolling out, but the diversity update is – and of course, this can be confusing for SEOs trying to track things properly.

Danny Sullivan

@dannysullivan

It started a little bit about two days ago but went fully live today. Personally, I wouldn’t think of it like an update, however. It’s not really about ranking. Things that ranked highly before still should. We just don’t show as many other pages.


He thinks this shouldn’t confuse SEOs:

Barry Schwartz

@rustybrick

but yet you announced it so this change will be “noticeable” and thus impact analytics and search console data, right? it would be nice if you would have held off a week to let the June update roll out more before doing this, right? feedback for the team.

Danny Sullivan

@dannysullivan

We launch things almost every day. Sometimes several in a single day. This is far enough out from the core update release that any stat changes can probably be distinguished.


Remco Tensen@RemcoTensen

This effects rankings. For one: you’re awared less slots. That’s direct loss of rankings. Two: less chances to seduce is less interactions with the site. This should impact your site’s rankings. Three: less results for your site shown, means more SERP competition. Esp. on mobile.

Danny Sullivan

@dannysullivan

No, it potentially affects traffic. If you had four different pages all doing the right things in terms of content and quality to reach the top page, then your ranking efforts are on the right track. And this doesn’t mean they aren’t as good. It just means we won’t show as many.


Only Impacts Core Web Results

So this only impacts the ten blue listings – the core web results. It does not stop you from showing more than twice via featured snippets, local listings, images, and other search features, according to Danny Sullivan:

Danny Sullivan

@dannysullivan

It’s about the main listings, not various other displays on the search results.


Marie Haynes

@Marie_Haynes

Sounds like the diversity update will only affect regular organic results and not affect features like featured snippets, people also asks, etc.

Danny Sullivan

@dannysullivan


It’s about the main listings, not various other displays on the search results.

Danny Sullivan

@dannysullivan

Yes. This is only about the main web search listings. It not including things like featured snippets, map listings, etc.


Domain Based, Not Content Based

This is strictly related to domains and doesn’t look to see if the content is the same across different domain names. This is not a duplicate content thing…

Ben Cook@BenjaminCook

How about searches where all of the results are different sites but all powered by the same database of products?

Danny Sullivan

@dannysullivan

As said in the tweets, it’s focused around domains.


It Is Not Perfect

Google said this is not perfect and Google will make updates to it over time:

Tom Waddington@tomwaddington8

How about 8?


Danny Sullivan

@dannysullivan

It’s not going to be perfect. As with any of our releases, we’ll keep working to improve it. You might also try it the way someone in Tustin would do it — “nail salons” or “nail salons near me” or “nail salons tustin.” If you’re in Tustin, you know you’re in CA 🙂


Danny Sullivan

@dannysullivan

Oh, and I see one of the screenshots has some of those examples! But as said, things hopefully have improved for a variety of searches, and we’ll keep looking to improve.


David Carralon@DavidCarralon

Great news, though it doesn’t seem to work in some verticals, eg: jobs. See my search for a specific job in Google UK, VPN geotargeted to London. I get 4 Indeed listings for irrelevant locations to a Uk user. @JohnMu something to look into maybe?


Danny Sullivan

@dannysullivan

Those are completely different domains. Same company; different geo domains. But it’s a good point. I’ll pass this feedback on. We expect to keep improving on the change.


Bolocan Cristian@bolocancristian

@searchliaison what do you say about this query from Romania? On first page, first 9 results are http://emag.ro , best brand from Romania.https://prnt.sc/nuvugy 


Danny Sullivan

@dannysullivan

We’ll be looking to improve. I’ll pass this on.


Google Has Done This Before

Google has updated its domain diversity filters or weights many times before. We have covered it before; Google has made this kind of update countless times – most of the time without telling us.

Sarcasm

And Gary Illyes weighs in:

Gary “鯨理” Illyes

@methode

More diversity in the search results (10 blue). Unrelated to core algo update.
If you lost traffic because of this, please tweet @JohnMu . If you gained traffic, feel very free to tweet at me. kthxbai


So there you have it – that is everything we know about this specific update, which you don’t have to call an “update” if you don’t want to. Call it a “change” or a “feature” or whatever you want.