About Mike Dickman

20-year marketing veteran doing my thing, one click at a time!

Google tells some advertisers it will handle their campaign management

Originally posted on Search Engine Land.

In seven days, the email says, Google Ads reps will start making changes to advertiser accounts if advertisers don’t opt out.

This article has been updated with additional details and a statement from Google.

“We’ll focus on your campaigns, so you can focus on your business.” That’s the headline in an email some advertisers have begun receiving from Google Ads this week.

Why you should care. Google has steadily introduced automation to just about every area of campaign creation and management (Ads Added by Google, Smart Campaigns, Local Campaigns, Universal App Campaigns, Responsive Search Ads, Smart Bidding strategies — you get the idea). But this effort is ostensibly human-powered. Google is bringing in “Google Ads experts” to manage campaigns “behind the scenes.” One can assume, however, that the changes the experts make will be largely influenced by Google’s machine-powered recommendations engine. This kind of program will have immediate implications for the advertisers that have this service turned on, but there are longer-term implications for the broader ecosystem of agencies, consultants, clients and paid search practitioners.

Auto opt-in. Better check your email. Unless they opt out, advertisers will be added to the program automatically seven days after receiving the email. Google notes, however, that it is possible to opt out at any point afterwards.

What will these experts be doing? According to the email, they’ll identify “key changes that can help you get more out of your ads, from restructuring your ad groups and modifying your keywords to adjusting your bids and updating your ad text.” That’s structure, keywords, bids, ads. They’ll also offer “setup and ongoing activation of advanced features” and “ensure the right features are being activated at the right moment.” What they say they don’t touch are budgets.

Aaron Levy, director of PPC at Elite SEM, tweeted the email Google is sending out to some accounts.

Is Google undercutting agencies and consultants? Google has long had account reps and teams that reach out to advertisers (and agencies) with optimization recommendations and account consultations. It is also not uncommon for agencies to complain that Google reps have reached out directly to clients whose accounts are clearly under the agency’s management.

Whenever pushed on this, Google’s response is that it partners closely with agencies and consultants — through its partner programs, outreach and other efforts. With this program, the thinking goes, agencies and consultants that have advertisers participating in the program could dedicate more time to strategy and spend less time on tactical workaday tasks.

Pilot program. A Google spokesperson told Search Engine Land, “Our sales teams are always looking for ways to help customers get the best results from Google Ads. We are rolling out a pilot program that we believe will help businesses optimize their accounts. As always, we build customer feedback into the final product. Customers are in full control of the account and can accept or reject recommendations as they desire.”

Advertisers in the program will be notified of suggested optimizations and new features via email and can opt out at any point.

It’s not clear what experience or training the Google Ads experts who will be optimizing campaigns have. Google does tout the knowledge it has gleaned from “optimizing over 800,000 Google Ads accounts.” The company has millions of advertisers, and again it’s unclear what kinds of accounts these are and what kind of hand Google has had in optimizing them.

Google is careful to say that advertisers are still responsible for the results of their campaigns and shouldn’t put blind faith in its optimization efforts. In a disclaimer (posted by Levy) the company says it “doesn’t guarantee or promise any particular results from implementing these changes” made by its experts. Advertisers are still encouraged to “monitor your account regularly so you understand what’s happening and can make campaign adjustments.”

It should also be noted that if the recommendations have a negative impact on results, Google may offer refunds to advertisers.

A copy of the email message is below.

 

The death of re-marketing and its repercussions on e-commerce

Written by Niamh Reed and originally posted on DigitalDoughnut.

Google has been taking steps to end re-marketing. A bold move, perhaps, and a worrying one for advertisers. Google has recently introduced an addition to their ad-blocking capability that allows consumers to block all your re-marketing ads.  Nor has the organisation stopped there: Google is still developing their ad blockers as you read this.

But does this mark the end of re-marketing, and if so, what does that mean for your e-commerce marketing practices and your business?

The ‘whats’ and ‘whys’ of re-marketing

Everyone has experienced re-marketing – probably without even realising it initially. Re-marketing refers to those adverts that follow you around the web, reminding you of products that you were browsing the other day.

For customers, it’s a creeping reminder that you might still need to buy that pair of orange wellies you were looking at yesterday. For businesses, it’s a highly cost-effective tool, increasing their sale of orange wellies and decreasing cart abandonment.

“Re-marketing isn’t actually marketing at all. It’s a sales tactic.” – Mike Michalowicz

Re-marketing can be incredibly effective. In fact, a website visitor who’s been retargeted with a display ad is 70% more likely to convert. It works by creating multiple impressions of the product, making it stick in the customer’s mind until they decide to buy.

Google – re-marketing’s killer?

As useful as re-marketing may be, it’s becoming increasingly difficult to rely on. Brands, prepare: Google is going after your adverts.

On January 25, Google announced an addition to their ad-blocking capabilities that allows customers to block re-marketing ads. They’re providing your prospects with even more ways of resisting your attempts to court them through re-marketing.

And it’s not just the particular ad that happened to annoy the customer getting blocked. It’s you. Every advert you direct at that customer is now being blocked by Google for 90 days. Plus, Google is continuing to fine-tune its ad blockers, and your ads aren’t just being blocked on one device, but across all the customer’s devices.

Blessing in disguise?

Of course, re-marketing has always come with its cons. There is a question as to whether the overuse of re-marketing technology and tactics comes at the expense of annoyed customers. After you’ve seen the same ad for ages, for a product you’ve already bought, it can be grating to still have it shoved in your face.

Evidently, Google agrees that repetitive re-marketing ads have a negative impact on user experience. Could Google even be doing businesses a favour?

It certainly is the case that the inability to rely on re-marketing encourages a better customer experience on site – after all, it’s just become much harder to tempt them back again. Besides, re-marketing was only annoying if you overused it. For companies that use it well, this new Google move might not be such a painful one.

Google won’t automatically block your adverts (for now). So, if customers like your adverts (or at least, don’t find them annoying), you’ve nothing to worry about. Except, there’s no way to guarantee that.

Real-time conversions

So, what do you need to do to convert amid the death of re-marketing? You need to focus on converting your online shoppers while they are live on your page, browsing your digital store. The adage goes that a bird in the hand is worth two in the bush, and the same goes for your customers. A customer in your store (or on your page) is worth two on the street (or scrolling through Google).

The death of re-marketing means that there needs to be an increased emphasis on proactive, real-time service when your customers are browsing your page. With tools such as live chat software, you can provide your customers with real-time, on-site support that encourages them to stay on your website. Because they don’t need to wait for answers to queries, customers are less inclined to leave your site before making their purchase – your team can put any worry to bed. It’s the equivalent of an attentive customer service employee in a retail store.

Convert, convert, convert

With Google making it harder for you to stalk your customers once they leave your site, you need to place your focus on optimising the customer experience while you have their attention.

For brands, the death of re-marketing needn’t be the death of conversions. Refocus your efforts on providing proactive, real-time support for your website prospects, and on continually enhancing their experience.

Don’t rely on stalking consumers through re-marketing – Google may well just stop you in your tracks.​

A case study: Do Google Posts impact ranking?

This article was written by Joy Hawkins and first appeared on Search Engine Land.

A few months ago, I teamed up with Ben Fisher from Steady Demand to test whether Google Posts have any influence on ranking in the local results (the “3-pack”).

The methodology

We picked two different businesses to test this on. One was a garage flooring company that had been struggling to rank for their main keywords in the 3-pack. The second was my church, which does not have any SEO or marketing efforts going on. The team at Steady Demand made two posts every seven days in Google My Business from August 11, 2017, to October 1, 2017.

Case 1: Garage flooring in Vancouver

For this business, we saw their ranking in the local results increase one position on “garage flooring Vancouver.”  It moved up from position four to position three, winning them a spot in the 3-pack, less than a week after they started posting.

For just “garage flooring” (implicit search), they increased from position seven to two about four days after the posts started. I double-checked to make sure they didn’t receive any new reviews a few days prior to the increase, since they did receive a few new reviews during the course of the test.

Case 2: Church in Keswick, Ontario

For my church, we were mainly tracking how they ranked for “church” keywords. They increased from position five to position three for “church keswick, on” but did so gradually after the posting started.

Unlike the garage flooring company, the posts on the church’s listing also drove a significant amount of traffic to the website. We used UTM codes to track the traffic in Google Analytics. Most of it came from mobile, and about half the traffic came from within the target area (the rest was out-of-state).
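UTM tagging like the kind used here is easy to automate. Below is a minimal sketch using only Python’s standard library; the helper name `add_utm` and the example URL and campaign values are placeholders for illustration, not anything prescribed by Google Analytics itself.

```python
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def add_utm(url, source, medium, campaign):
    """Append UTM parameters so Google Analytics can attribute the visit.

    Any query parameters already on the URL are preserved.
    """
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunparse(parts._replace(query=urlencode(query)))

# Example: tag the link used in a Google Post (placeholder values)
tagged = add_utm("https://example.com/pastor-spotlight",
                 "google_my_business", "organic", "posts")
print(tagged)
```

Any link placed in a Google Post can be run through a helper like this first, so each post’s traffic shows up under its own source/medium/campaign in Google Analytics.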

The post responsible for the largest amount of traffic was actually a bio/spotlight of one of the pastors. Google My Business showed zero engagement on the post despite its driving 74 new users to the website.

Conclusions:

  1. Based on what we’re seeing for this case and others we’re testing, I think Google Posts do have a mild impact on ranking. These tests were purposely done in non-competitive industries, so it might not be enough to produce movement in some scenarios.
  2. Google Posts are low-impact, low-effort tasks. They should be combined with other tasks to help improve Local SEO for a small business.
  3. The rankings maintained themselves weeks after we stopped posting on the listings. This is different from what has been observed about posting on social platforms such as Google Plus.
  4. Google My Business Insights are wrong. You need to use UTM codes on your URLs to get proper insights on these in Google Analytics.

Google Posts have been surprisingly underutilized by businesses and agencies, partly due to the past inability to schedule posts. With the recent feature addition to create and update posts via the Google My Business API, this is an opportunity for people to start using Google Posts more.

Google Ignores rel=shortlink Link Attribute

This was originally posted on Search Engine Roundtable and was written by Barry Schwartz.

Google’s John Mueller said on Twitter that Google ignores the rel=shortlink link attribute.

The proposed usage is this: by adding rel=”shortlink” to a hyperlink, a page indicates that the hyperlink may be used for space-constrained and/or manual-entry (e.g. printed or spoken) applications, and that the destination of that hyperlink carries the same meaning (even if formatting such as sort order and highlighting is lost). Typical use cases include pasting links into microblogging services such as Twitter and anywhere manual entry is required (e.g. printed or spoken URLs).

 


 

Either way, if you want Google to pass signals from your short URLs, make sure they 301 to the main URL.
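One way to sanity-check that a short URL 301s as intended is to request it without following redirects and inspect the response. A minimal sketch with Python’s standard library; the helper names and URLs are placeholders of my own, not part of any Google tooling.

```python
import http.client
from urllib.parse import urlparse

def redirect_target(short_url):
    """Request a URL without following redirects and return
    (status code, Location header) so you can see how it redirects."""
    parts = urlparse(short_url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    return resp.status, resp.getheader("Location")

def passes_signals(status, location, canonical_url):
    """A short URL passes signals when it 301s directly to the canonical URL."""
    return status == 301 and location == canonical_url

# Usage (placeholder URLs; requires network access):
# status, location = redirect_target("https://exmpl.co/abc")
# print(passes_signals(status, location, "https://example.com/full-article/"))
```

A 302 or a chain of hops shows up immediately with a check like this, which is exactly the situation you want to catch before assuming your short URLs consolidate signals.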

 

SEO 101: Which URL versions to add to Google Search Console

This article first appeared on Search Engine Land and was written by Fili Wiese.

Google Search Console serves as an excellent (not to mention free) source of technical data about your website’s organic visibility and performance. To maximize its usefulness, it’s important to properly set up your website in Search Console by adding all versions of your domain as properties that you manage.

Let’s assume the domain name of the website is https://example.com/.

The first step here is to add the following to Google Search Console as a new property:

     example.com

Make sure to verify the domain name, preferably using a TXT record or CNAME record in the DNS.

Next, add the www version as a property (even if it redirects to the non-www version):

     www.example.com

In this case, both URLs above redirect to the HTTPS version of the website (learn how to move your website to HTTPS). That means that these variations will also need to be added as two separate properties in Google Search Console:

     https://example.com/
     https://www.example.com/

Note that you must specifically include “https://” when adding these two properties, which you did not have to do with the HTTP version. If no protocol is defined when adding a property to Google Search Console, it defaults to the HTTP-protocol.

At this point, the following URLs have been added to Google Search Console as properties, even if the HTTP versions do not serve any content and redirect fully to the HTTPS versions:

     http://example.com/
     http://www.example.com/
     https://example.com/
     https://www.example.com/

To summarize, for any website on its own domain and being served only over the HTTP protocol, at a bare minimum, two versions of your domain need to be present in Google Search Console. For any website on its own domain and being served over the HTTPS protocol, at a bare minimum, four versions of your domain need to be present in Google Search Console.
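These minimum variants can be generated mechanically rather than typed by hand. A sketch in Python; the helper name `property_variants` is mine, and actually registering the properties still happens in the Search Console interface.

```python
def property_variants(domain, https=True):
    """Return the URL variants to register as Search Console properties.

    HTTP-only sites need the bare and www hostnames (Search Console
    interprets a property with no protocol as HTTP); HTTPS sites need
    all four, with "https://" spelled out explicitly.
    """
    variants = [domain, "www." + domain]  # treated as http:// by default
    if https:
        variants += ["https://" + domain + "/", "https://www." + domain + "/"]
    return variants

print(property_variants("example.com"))
```

For an HTTPS site this yields the same four entries listed above; passing `https=False` yields just the two HTTP-era entries.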

Getting more data from Google Search Console

If the website has any subdomains, or any language-, country- or content-specific (or otherwise distinct) subdirectories, it will be beneficial to add these as separate properties in Google Search Console. Doing so will allow you to get more data, set geographic targets or define specific sitemaps. (Note that this also includes subdomains that are not meant for indexing, such as staging servers, or that have no data available, such as an admin login subdomain.)

Let’s assume the website has two additional subdomains (blog and news), two language subdirectories (DE and EN), two content-specific subdirectories (product and amp) and a staging subdomain all on the HTTPS-protocol variation. This means that, in addition to the URLs above, the following additional URLs also need to be added as new properties in Google Search Console:

     https://blog.example.com/
     https://news.example.com/
     https://example.com/de/
     https://example.com/en/
     https://example.com/amp/
     https://example.com/product/
     https://staging.example.com/

To be safe, it is best to also add the following as new properties in Google Search Console:

     http://blog.example.com/
     http://news.example.com/
     http://example.com/de/
     http://example.com/en/
     http://example.com/amp/
     http://example.com/product/
     http://staging.example.com/

And to be extra, extra safe, the following (www versions) can also be added as new properties to Google Search Console:

     https://www.example.com/de/
     https://www.example.com/en/
     https://www.example.com/amp/
     https://www.example.com/product/

And

     http://www.example.com/de/
     http://www.example.com/en/
     http://www.example.com/amp/
     http://www.example.com/product/

Now, Google Search Console can provide additional specific and detailed search-related data, such as Search Analytics data, for each subdomain and subdirectory.
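The full expansion above can also be generated programmatically, which makes it easy to keep the property list in sync as subdomains and subdirectories are added. A sketch in Python; the helper name `all_properties` is mine, and this only builds the URL list — registering each property still happens in Search Console itself.

```python
from itertools import product

def all_properties(domain, subdomains=(), subdirectories=()):
    """Expand a domain into every property URL worth adding to Google
    Search Console: both protocols and both www/non-www hosts for the
    root and its subdirectories, plus both protocols for each subdomain."""
    protocols = ("http://", "https://")
    urls = []
    for proto, host in product(protocols, (domain, "www." + domain)):
        urls.append(proto + host + "/")
        urls += [proto + host + "/" + d + "/" for d in subdirectories]
    for proto, sub in product(protocols, subdomains):
        urls.append(proto + sub + "." + domain + "/")
    return urls

props = all_properties("example.com",
                       subdomains=("blog", "news", "staging"),
                       subdirectories=("de", "en", "amp", "product"))
print(len(props))  # 4 protocol/host combos x 5 paths + 6 subdomain URLs = 26
```

For the example site in this article, that produces the same set of property URLs enumerated in the lists above.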

Making the data more useful

If all the URL variations mentioned above are added as properties, there are now 26 separate properties in Google Search Console, each one providing specific and valuable insights into how Google “sees” the website. It may therefore be hard to know which property to check for ranking data in Google Search Console Search Analytics. Luckily, Google added a feature called “property sets” last year.

Property sets combine the data from several properties and present the data in a unified view. To create a property set, go to the Google Search Console and click “Create a set.” Next, give the set a name and add previously verified Google Search Console properties to the set.

There are various property sets you may find useful in terms of data segmentation; below are my suggestions for grouping properties together.

All data property set

To get one source for all ranking data in Google Search Console for the website, add all 26 properties to one property set (highly recommended):

     http://example.com/
     http://www.example.com/
     https://example.com/
     https://www.example.com/
     https://blog.example.com/
     https://news.example.com/
     https://example.com/de/
     https://example.com/en/
     https://example.com/amp/
     https://example.com/product/
     https://www.example.com/de/
     https://www.example.com/en/
     https://www.example.com/amp/
     https://www.example.com/product/
     https://staging.example.com/
     http://blog.example.com/
     http://news.example.com/
     http://example.com/de/
     http://example.com/en/
     http://example.com/amp/
     http://example.com/product/
     http://www.example.com/de/
     http://www.example.com/en/
     http://www.example.com/amp/
     http://www.example.com/product/
     http://staging.example.com/

English language data

To narrow the ranking data in Google Search Console for the English part of the website, group the following into another property set:

     https://example.com/en/
     https://www.example.com/en/
     http://example.com/en/
     http://www.example.com/en/

German language data

To narrow the ranking data in Google Search Console for the German part of the website, group the following into another property set:

     https://example.com/de/
     https://www.example.com/de/
     http://example.com/de/
     http://www.example.com/de/

News/blog data

To narrow the ranking data in Google Search Console for the news/blog part of the website, group the following into a property set:

     https://blog.example.com/
     http://blog.example.com/
     https://news.example.com/
     http://news.example.com/

Product page data

To narrow the ranking data in Google Search Console for just the product part of the website, group the following into a property set:

     https://example.com/product/
     https://www.example.com/product/
     http://example.com/product/
     http://www.example.com/product/

Keep track of staging URLs

To make sure none of the staging URLs are indexed, add the following to another property set:

     https://staging.example.com/
     http://staging.example.com/

Continue creating new property sets in Google Search Console if it makes sense for your business. Keep in mind that property sets do not show data retroactively — they only start collecting data from the moment they are created, and it can take several days before the first data becomes available for the user. Thus, creating a property set sooner rather than later is in the site owner’s best interest.

Just a start…

A great Google Search Console setup is just the first step towards maximizing your SEO efforts. It is an important one, though.

The sample data provided by Google can help improve your rankings, help Googlebot better understand the website and provide invaluable and otherwise unavailable insights into your organic visibility and performance. It is also possible to download sample data through an API, integrate the data with internal data and bring your SEO to the next level.

Adding the right properties to Google Search Console is a priority because you never know when your business may need the data. And it’s free — so what are you waiting for?

11 Lessons Learned from Failed Link Building Campaigns

This post originally appeared on Moz and was written by Kerry Jones.

We’ve created more than 800 content campaigns at Fractl over the years, and we’d be lying if we told you every single one was a hit.

The Internet is a finicky place. You can’t predict with 100% accuracy if your content will perform well. Sometimes what we think is going to do OK ends up being a massive hit. And there have been a few instances where we’d expect a campaign to be a huge success but it went on to garner lackluster results.

While you can’t control the whims of the Internet, you can avoid or include certain things in your content to help your chances of success. Through careful analysis we’ve pinpointed which factors tend to create high-performing content. Similarly, we’ve identified trends among our content that didn’t quite hit the mark.


In this post, I’ll share the most valuable lessons we learned from content flops. Bear in mind this advice applies if you’re using content to earn links and press pickups, which is what the majority of the content we create at Fractl aims to do.

1. There’s such a thing as too much data.

For content involving a lot of data, it can be tempting to publish every single data point you collect.

A good example of this is surveying. We’ve fallen down the rabbit hole of not only sharing all of the data we’ve collected in a survey, but also segmenting the data out by demographics — regardless of whether or not all of that data is super compelling. While this can give publishers a large volume of potential angles to choose from, the result is often unfocused content lacking a cohesive narrative.


Only include the most insightful, interesting data points in your content, even if that means tossing aside most of the data you’ve gathered.

One example of this was a survey we did for a home security client where we asked people about stalker-ish behaviors they’d committed. The juiciest survey data (like 1 in 5 respondents had created a fake social account to spy on someone — yikes!) ended up getting buried because we included every data point from the survey, some of which wasn’t so interesting. Had we trimmed down the content to only the most shocking findings, it probably would have performed far better.

Furthermore, the more data you include, the more time it takes for a publisher to wade through it. As one journalist told us after we sent over an epic amount of data: “Long story short, this will take too much time.”

Consider this: It shouldn’t take a publisher more than 10 seconds of looking at your project to grasp the most meaningful data points. If they can’t quickly understand that, how will their readers?

2. Turning published data into something cool doesn’t always yield links.

If you’re going to use data that’s already been reported on, you better have a new spin or finding to present. Journalists don’t want to cover the same stats they have already covered.

A great example of this is a project we created about the reasons startups fail. The majority of the data we used came from CB Insights’ startup post mortems list, which had performed really well for them. (As of the time I’m writing this, according to Open Site Explorer it has 197 linking root domains from sites including BBC, Business Insider, Fortune, Vox, CNBC, and Entrepreneur — impressive!)

It worked well once, so it should work again if we repackage it into a new format, right?

We used the startups featured on the CB Insights list, added in a handful of additional startups, and created a sexy-looking interactive node map that grouped together startups according to the primary reasons they went under.

While the content didn’t end up being a failure (we got it picked up by Quartz, woo!), it definitely didn’t live up to the expectations we had for it.

Two problems with this project:

  1. We weren’t saying anything new about the data.
  2. The original data had gotten so much coverage that many relevant publishers had already seen it and/or published it.

But of course, there are exceptions. If you’re using existing data that hasn’t gotten a ton of coverage, but is interesting, then this can be a smart approach. The key is avoiding data that has already been widely reported in the vertical you want to get coverage in.

3. It’s difficult to build links with videos.

Video content can be extremely effective for viral sharing, which is fantastic for brand awareness. But are videos great for earning links? Not so much.

When you think of viral content, videos probably come to mind — which is exactly why you may assume awesome videos can attract a ton of backlinks. The problem is, publishers rarely give proper attribution to videos. Instead of linking to the video’s creator, they just embed the video from YouTube or link to YouTube. While a mention/link to the content creator often happens organically with a piece of static visual content, this is often not the case with videos.

Of course, you can reach out to anyone who embeds your video without linking to you and ask for a link. But this can add a time-consuming extra step to the already time-intensive process of video creation and promotion.

4. Political ideas are tough to pull off.

Most brands don’t want to touch political topics with a ten-foot pole. But to others, creating political content is appealing since it has strong potential to evoke an emotional reaction and get a lot of attention.


We’ve had several amazing political ideas fail despite solid executions and promotional efforts. It’s hard for us to say why this is, but our assumption has been that publishers don’t care about political content that isn’t breaking news (because political news is always breaking). For this reason, we believe it’s nearly impossible to compete with the constant cycle of breaking political news.

5. Don’t make content for a specific publisher.

We’ve reached out to publishers to collaborate during content production, assuming that if the publisher feels ownership over the content and it’s created to their specifications, they will definitely publish it.

In general, we’ve found this approach doesn’t work because it tends to be a drain on the publishers (they don’t want to take on the extra work of collaborating with you) and it locks you into an end result that may only work for their site and no other publishers.

Remember: Publishers care about getting views and engagement on their site, not link generation for you or your client.

6. Hyperlocal content is a big risk.

If you focus on one city, even with an amazing piece of content featuring newsworthy information, you’re limited in how many publishers you can pitch it to. And then, you’re out of luck if none of those local publishers pick it up.

On the flip side, we’ve had a lot of success with content that features multiple cities/states/regions. This allows us to target a range of local and national publishers.

Note: This advice applies to campaigns where links/press mentions are the main goal – I’m not saying to never create content for a certain locality.

7. Always make more than one visual asset.

And one of those assets should always be a simple, static image.

Why?

Many websites have limits to the type of media they can publish. Every publisher is able to publish a static graphic, but not everyone can embed more complex content formats (fortunately, Moz can handle GIFs).


In most cases, we’ve found publishers prefer the simplest visualizations. One classic example of this is a project where we compared reading levels and IQ across different states based on an analysis of half a million tweets. Our Director of Creative, Ryan Sammy, spent a painstaking amount of time (and money) creating an interactive map of the results.

What did most publishers end up featuring? A screenshot of a Tableau dashboard we had sent as a preview during outreach…

8. Be realistic about newsjacking.

Newsjacking content needs to go live within 24 to 48 hours of the news event to be timely. Can you really produce something in time to newsjack?

We’ve found newsjacking is hard to pull off in an agency setting since you have to account for production timelines and getting client feedback and approval. In-house brands have a more feasible shot at newsjacking if they don’t have to worry about a long internal approval process.

9. Watch out for shiny new tools and content formats.

Just because you are using cool, new technology doesn’t automatically make the content interesting. We’ve gotten caught up in the “cool factor” of the format or method only to end up with boring (but pretty) content.

10. Avoid super niche topics.

You greatly increase your risk of no return when you go super niche. The more you drill down a topic, the smaller your potential audience becomes (and potential sites that will link become fewer, too).

There are a ton of people interested in music, there are fewer people interested in rap music, there are even fewer people interested in folk rap music, and finally, there are so few people interested in ’90s folk rap. Creating content around ’90s folk rap will probably yield few to no links.

Some questions to ask to ensure your topic isn’t too niche:

  • Is there a large volume of published content about this topic? Do a Google search for a few niche keywords to see how many results come up compared to broader top-level topics.
  • If there is a lot of content, does that content get high engagement? Do a search in Buzzsumo for keywords related to the niche topic. Is the top content getting thousands of shares?
  • Are people curious about this topic? Search on BloomBerry to see how many questions people are asking about it.
  • Are there online communities dedicated to the topic? Do a quick search for “niche keyword + forum” to turn up communities.
  • Are there more than 5 publishers that focus exclusively on the niche topic?

11. Don’t make content on a topic you can’t be credible in.

When we produced a hard-hitting project about murder in the U.S. for a gambling client, the publishers we pitched didn’t take it seriously because the client wasn’t an authority on the subject.

From that point on, we stuck to creating more light-hearted content around gambling, partying, and entertainment, which is highly relevant to our client and goes over extremely well with publishers.

It’s OK to create content that is tangentially related to your brand (we do this very often), but the connection between the content topic and your industry should be obvious. Don’t leave publishers wondering, “Why is this company making this content?”



Learning from failure is crucial for improvement.

Failure is inevitable, especially when you’re pushing boundaries or experimenting with something new (two things we try to do often at Fractl). The good news is that with failure you tend to have the greatest “a-ha!” moments. This is why having a post-campaign review of what did and didn’t work is so important.

Getting to the heart of why your content is rejected by publishers can be extremely helpful — we collect this information, and it’s invaluable for spotting things we can tweak during content production to increase our success rate. When a publisher tells you “no,” many times they will give a brief explanation why (and if they don’t, you can ask nicely for their feedback). Collect all of this publisher feedback and review it every few months. Like us, you may notice trends in why publishers are passing on your content. Use these insights to correct your course instead of continuing to make the same mistakes.

And one last note for anyone creating content for clients: What should you do when your client’s campaign is a flop? To mitigate the risk to our clients, we replace a campaign if it fails to get any publisher coverage. While we’ve rarely had to do this, putting this assurance in place can give both you and your client peace of mind that a low-performing campaign doesn’t mean their investment has gone to waste.

What have you observed about your content that didn’t perform well? Does your experience contradict or mirror any of the lessons I shared?

Google, Is, Okay, With, Commas, Even, In, Your, Title, Tags

This article first appeared on Search Engine Roundtable and is authored by Barry Schwartz.

I am really not sure where these ideas come from (which is why I have stuff to write about daily), but this guy asked John Mueller if putting commas in title tags is okay or not. John said it is “totally up to you!” Here is the tweet:

[embedded tweet]

I mean, why would you not be allowed to use commas in your titles? I don’t understand the logic of avoiding them.

People, like me, overuse commas, when we write. But other than that, using commas when it makes sense, even in titles, does make sense.

See what I am doing here?

So use commas where it makes sense and do not worry about how search engines will react to it. Google will handle it just fine. Make sure it doesn’t bother your readers.

Social Media Advertising Hacks

Written by Larry Kim, founder of WordStream.

Let’s start with the bad news first. It’s tougher than ever to get content noticed.

Changes to Google search results pages have further obscured content organically, especially on competitive commercial searches. Meanwhile, paid search costs per click (CPCs) are at all-time highs in established markets.

Organic reach in social media? It’s pretty much dead. Half of all content gets zero shares, and less than 0.1 percent is shared more than 1,000 times.

Additionally, the typical internet marketing conversion rate is less than one percent.

How Content Marketing Doesn’t (Usually) Work

How does content marketing actually work? Many people’s content marketing strategy basically consists of a three-step process:

  1. Create new content.
  2. Share your content on social networks (Facebook, Twitter, LinkedIn, etc.).
  3. People buy your stuff.

Nope. This almost never happens. (For a content marketing strategy that actually works, try a documented plan such as the Content Marketing Pyramid™.)

Most content goes nowhere. The consumer purchase journey isn’t a straight line—and it takes time.

So is there a more reliable way to increase leads and sales with content?

Social Media Advertising To The Rescue!

Now it’s time for the good news! Social media advertising provides the most scalable content promotion and is proven to turn visitors into leads and customers.

And the best part? You don’t need a huge ad budget.

A better, more realistic process for content marketing with promotion looks like this:

  1. Create: Produce content and share it on social media.
  2. Amplify: Selectively promote your top content on social media.
  3. Tag: Build your remarketing audience by tagging site visitors with a cookie.
  4. Filter: Apply behavioral and demographic filters on your audience.
  5. Remarketing: Remarket to your audience with display ads, social ads, and Remarketing Lists for Search Ads (RLSA) to promote offers.
  6. Convert: Capture qualified leads or sale.
  7. Repeat.

Promotion is sorely overlooked on many content marketers’ priority lists — it’s actually the lowest priority, according to a recent study of 1,000+ marketers. For marketers who take promotion more seriously, The Ultimate List of Content Promotion Tools is a godsend.

You can use the following ten Twitter and Facebook advertising hacks as a catalyst to get more eyeballs on your content, or as an accelerant to create an even larger traffic explosion.

1. Improve Your Quality Score

Quality Score is the metric Google uses to rate the quality and relevance of your keywords and PPC ads, and it influences your cost per click. Facebook calls its version a “Relevancy Score,” while Twitter’s is called a “Quality Adjusted Bid.”

Whatever it’s called, Quality Score is a crucial metric. The way to increase Twitter and Facebook Quality Scores is to increase post engagement rates.

A high Quality Score is great: you get a higher ad impression share for the same budget at a lower cost per engagement. On the flip side, a low Quality Score sucks: you get a low ad impression share and a high cost per engagement.

How do you increase engagement rates? Promote your best content—your unicorns (the top 1-3 percent of content that performs better than everything else) rather than your donkeys (the bottom 97 percent of your content).

To figure out if your content is a unicorn or donkey, test it out.

  • Post lots of stuff (organically) to Twitter and use Twitter Analytics to see which content gets the most engagement.
  • Post your top stuff from Twitter organically to LinkedIn and Facebook. Again, track which posts get the most traction.
  • Pay to promote the unicorns on Facebook and Twitter.
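The testing workflow above boils down to ranking posts by engagement rate and promoting only the top sliver. Here is a minimal Python sketch of that logic; the post data and the 3 percent cutoff are illustrative assumptions, not output from any real analytics API.

```python
# Hedged sketch: rank posts by engagement rate to find "unicorns"
# (the top ~3 percent) worth paying to promote.

def engagement_rate(post):
    """Engagements (likes + shares + replies) divided by impressions."""
    return post["engagements"] / post["impressions"]

def find_unicorns(posts, top_fraction=0.03):
    """Return the top ~3 percent of posts by engagement rate (at least one)."""
    ranked = sorted(posts, key=engagement_rate, reverse=True)
    keep = max(1, int(len(ranked) * top_fraction))
    return ranked[:keep]

# Hypothetical numbers pulled from organic posting analytics:
posts = [
    {"id": "a", "engagements": 40,  "impressions": 1000},  # 4.0%
    {"id": "b", "engagements": 5,   "impressions": 1000},  # 0.5%
    {"id": "c", "engagements": 120, "impressions": 2000},  # 6.0%
    {"id": "d", "engagements": 8,   "impressions": 4000},  # 0.2%
]

unicorns = find_unicorns(posts)
print([p["id"] for p in unicorns])  # → ['c']
```

Everything below the cutoff is a donkey: post it organically if you like, but don’t spend ad budget on it.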

The key to paid social media advertising is to be picky. Cast a narrow net and maximize those engagement rates.

2. Increase Engagement With Audience Targeting

Targeting all your fans isn’t precise; it’s lazy and wastes a lot of money.

Your fans aren’t a homogenous blob. They all have different incomes, interests, values, and preferences.

For example, by targeting fans of Donald Trump, people with social media marketing job titles, NRA members, and the hashtag #NeverHillary (and excluding Democrats, fans of Hillary Clinton, and the hashtag #neverTrump), this tweet for an Inc. article I wrote got ten times higher engagement.

Keyword targeting and other audience targeting methods help turn average ads into unicorns.

3. Generate Free Clicks From Paid Social Media Advertising

On Twitter, tweet engagements are the most popular type of ad campaign. Why? I have no idea. You have to pay for every user engagement (whether someone views your profile, expands your image, expands your tweet from the tweet stream, or clicks on a hashtag).

If you’re doing this, you need to stop—now. It’s a giant waste of money and offers the worst ROI.

Instead, pay only for the thing that matters most to your business, whether it’s clicks to your website, app installs, followers, leads, or actual video views.

For example, when you run a Twitter followers campaign, you pay only when someone follows you. But your tweet promoting one of your unicorn pieces of content will also get a ton of impressions, retweets, replies, mentions, likes, and visits to your website. All for the low, low cost of $0.

4. Promote Unicorn Video Ads!

Would you believe you can get thousands of video views at a cost of just $0.02 per view?

Shoppers who view videos are more likely to remember you, and buy from you. Quick tips for success:

  • Promote videos that have performed the best (i.e., driven the most engagement) on your website, YouTube, or elsewhere.
  • Make sure people can understand your video without hearing it — an amazing 85 percent of Facebook videos are watched without sound, according to Digiday.
  • Make it memorable, try to keep it short, and target the right audience.

Bonus: video ad campaigns increase relevancy score by two points!

5. Score Huge Wins With Custom Audiences

True story: a while back I wrote an article asking: Do Twitter Ads Work? To promote the article on Twitter, I used their tailored audiences feature to target key influencers.

The very same day, Business Insider asked for permission to publish the story. So I promoted that version of the article to influencers using tailored audiences.

An hour later, a Fox News producer emailed me.

The awesome power of custom audiences resulted in additional live interviews with major news outlets, including the BBC; 250 high-value press pickups and links; massive brand exposure; 100,000 visits to the WordStream site; and a new business relationship with Facebook.

This is just one example of identity-based marketing using social media advertising. Whether it’s Twitter’s tailored audiences or Facebook’s custom audiences, this opens a ton of new and exciting advertising use cases!

6. Promote Your Content On More Social Platforms

Medium, Hacker News, Reddit, Digg, and LinkedIn Pulse can all send you massive amounts of traffic. It’s important to post content to each that’s appropriate to the audience.

Post content on Medium or LinkedIn. New content is fine, but repurposing existing content is a better strategy because it gives a whole new audience the chance to discover and consume your existing content.

Again, use social media advertising as either a catalyst or an accelerant to get hundreds, thousands, or even millions of views you otherwise wouldn’t have. It might even open you up to syndication opportunities—I’ve had posts syndicated to New York Observer and Time Magazine.

You can also promote existing content on sites like Hacker News, Reddit, or Digg. Getting upvotes can create valuable exposure that sends tons of traffic to your existing content.

For a minimal investment, you can get serious exposure and traffic!

7. Hacking RankBrain for Insanely Awesome SEO

Google is using an AI machine learning system called RankBrain to understand and interpret long-tail queries, especially on queries Google has never seen before—an estimated 15 percent of all queries.

I believe Google is examining user engagement metrics (such as click-through rates, bounce rates, dwell time, and conversion rates), in part, as a way to rank pages that have earned very few or no links.

Even if user engagement metrics aren’t part of the core ranking algorithm, getting really high organic CTRs and conversion rates has its own great rewards:

  • More clicks and conversions.
  • Better organic search rankings.
  • Even more clicks and conversions.

For example, research on the financial services company Experian found a 19 percent lift in paid search conversion volume and a 10 percent improvement in cost per action (CPA) among searchers who had been exposed to Facebook ads.

Use social media advertising to build brand recognition and double your organic search clickthrough and conversion rates!

8. Social Media Remarketing

Social media remarketing, on average, boosts engagement by three times and doubles conversion rates, while cutting your costs by a third. Make the most of it!

Use social media remarketing to push your hard offers, such as sign-ups, consultations, and downloads.

9. Combine Everything With Super Remarketing

Super remarketing is the awesome combination of remarketing, demographics, behaviors, and high engagement content. Here’s how and why it works.

  • Behavior and interest targeting: These are the people interested in your stuff.
  • Remarketing: These are the people who have recently checked out your stuff.
  • Demographic targeting: These are the people who can afford to buy your stuff.

If you target paid social media advertising to a narrow audience that meets all three criteria, using your high-engagement unicorns, the results can be spectacular.

10. Combine Paid Search & Social Media Advertising

For our final, and most advanced hack of them all, we combine social media advertising with PPC search ads on Google using Remarketing Lists for Search Ads (RLSA).

RLSA is incredibly powerful. You can target customized search ads specifically to people who have recently visited your site when they search on Google. It increases click-through and conversion rates by three times and reduces cost-per-click by a third.

There’s one problem. By definition, RLSA doesn’t target people who are unfamiliar with your brand. This is where social media advertising comes in: it helps more people become familiar with your brand.

Social media advertising is a cheap way to start the process of biasing people towards you. While they may not need what you’re selling now, later, when the need arises, people are much more likely to do a branded search for your stuff, or click on you during an unbranded search because they remember your compelling content.

If your content marketing efforts are struggling, these ridiculously powerful Twitter and Facebook advertising hacks will turn your content donkeys into unicorns! Looking for another awesome hack to supercharge your content ROI? Social curation enables more consistent content publication, supports your created content strategy, and helps you keep track of your favorite information. Download The Ultimate Guide to Content Curation eBook.

Google: Parameter Handling In Google Search Console Is A Big Gun, Watch Out

This article was originally featured on Search Engine Roundtable and was written by Barry Schwartz. 

Do you use the parameter handling feature within the Google Search Console interface? If so, Google’s Gary Illyes wants you to be careful. He told one webmaster that using the feature is not just a hint to Google; it is a directive. He added that it is also a “big gun” and you should “watch out with it.”

Here is what Gary posted on Twitter:

[embedded tweet]

This tool is used to help you control how Google understands your web site and URLs, especially for dynamic URLs. To learn more about it, see this Google help document.
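To see why this feature is a “big gun,” it helps to picture what parameter handling does: it tells Google which URL parameters can be ignored, so that many dynamic URL variants collapse into one. The sketch below is a conceptual Python illustration of that collapsing, not Google’s implementation; the parameter names are common examples chosen for the demo.

```python
# Hedged sketch: many URL variants with tracking/session parameters
# can all point at the same page. Parameter handling tells Google
# which parameters to ignore -- get it wrong, and distinct pages
# collapse together too.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative assumption: these parameters don't change page content.
IGNORABLE = {"sessionid", "utm_source", "utm_medium", "utm_campaign", "sort"}

def canonicalize(url):
    """Strip parameters that don't change what the page shows."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORABLE]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

variants = [
    "https://example.com/shoes?color=red&sessionid=123",
    "https://example.com/shoes?utm_source=news&color=red",
    "https://example.com/shoes?color=red&sort=price",
]
print({canonicalize(u) for u in variants})
# All three collapse to: https://example.com/shoes?color=red
```

The danger Gary is pointing at: if `color` had mistakenly been listed as ignorable, every color variant of the page would be treated as one URL, and Google would obey that directive rather than treat it as a hint.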

The feature has been in Google Search Console since 2009, so it is well established.

Google warns against misusing links in syndication & large-scale article campaigns

This article was originally posted on SearchEngineLand and was written by Danny Sullivan.

If the primary purpose of distributing content is to gain links, both authors and publishers risk a Google penalty.

Google’s out today with a warning for anyone who is distributing or publishing content through syndication or other large-scale means: Watch your links.

Google’s post reminds those who produce content published in multiple places that, without care, they could be violating Google’s rules against link schemes.

No content marketing primarily for links, warns Google

Google says that it is not against article distribution in general. But if such distribution is done primarily to gain links, then there’s a problem. From the post:

Google does not discourage these types of articles in the cases when they inform users, educate another site’s audience or bring awareness to your cause or company. However, what does violate Google’s guidelines on link schemes is when the main intent is to build links in a large-scale way back to the author’s site …

For websites creating articles made for links, Google takes action on this behavior because it’s bad for the Web as a whole. When link building comes first, the quality of the articles can suffer and create a bad experience for users.

Those pushing such content want links because links — especially from reputable publishers — are one of the top ways that content can rank better on Google.

Warning signs

What are things that may tip Google into viewing a content distribution campaign as perhaps violating its guidelines? Again, from the post:

  • Stuffing keyword-rich links to your site in your articles
  • Having the articles published across many different sites; alternatively, having a large number of articles on a few large, different sites
  • Using or hiring article writers that aren’t knowledgeable about the topics they’re writing on
  • Using the same or similar content across these articles; alternatively, duplicating the full content of articles found on your own site

Staying safe

There are two safe ways for those distributing content to stay out of trouble: using nofollow on specific links or the canonical tag on the page itself.

Nofollow prevents individual links from passing along ranking credit. Canonical effectively tells Google not to let any of the links on the page pass credit.
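As a quick illustration of vetting for these two mechanisms, a publisher could scan an article’s HTML for a canonical tag and for links missing nofollow before hitting publish. This is a minimal sketch using Python’s standard `html.parser`; the sample markup and domain names are invented for the example.

```python
# Hedged sketch: audit an article's HTML for the two safety mechanisms
# Google describes (rel=nofollow on links, rel=canonical on the page).
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followed_links = []   # links that would pass ranking credit
        self.canonical = None      # rel=canonical target, if any

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        rel = (a.get("rel") or "").lower().split()
        if tag == "a" and "nofollow" not in rel:
            self.followed_links.append(a.get("href"))
        if tag == "link" and "canonical" in rel:
            self.canonical = a.get("href")

sample = """
<link rel="canonical" href="https://original-site.example/post">
<a href="https://client.example/page" rel="nofollow">safe link</a>
<a href="https://client.example/keyword-rich">followed link</a>
"""

audit = LinkAudit()
audit.feed(sample)
print(audit.canonical)       # canonical is set, so credit flows to the original
print(audit.followed_links)  # remaining followed links worth a second look
```

A page with a canonical tag is broadly covered, but any followed links on a page without one are exactly the “questionable intent” links Google suggests publishers review.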

Publishers can be at risk, too

It’s important to note that Google’s warning isn’t just for those distributing content. Those publishing it can face issues with Google if they haven’t taken proper care. From Google’s post:

When Google detects that a website is publishing articles that contain spammy links, this may change Google’s perception of the quality of the site and could affect its ranking.

Sites accepting and publishing such articles should carefully vet them, asking questions like: Do I know this person? Does this person’s message fit with my site’s audience? Does the article contain useful content? If there are links of questionable intent in the article, has the author used rel=”nofollow” on them?

In other words, publishing content unquestioningly, in terms of links, could expose the publisher’s site to being penalized in Google.

Why this new warning?

Today’s warning from Google is generally the same as what it issued back in July 2013, when it cautioned about links in large-scale guest posting, advertorials, sponsored content and press releases. However, it’s more specific in terms of syndication and comes because of an issue that Search Engine Land has been investigating over the past month.

Search Engine Land has a policy of generally not writing about cases of spam or suspected spam that aren’t already public in a significant way. Our open letter from 2014 explains this more. In short, if we did this, that’s all we would ever be writing about.

That said, we received a tip about several businesses using article syndication that seemed worth taking a closer look at, given that the tactics potentially violated Google’s guidelines in a significant manner. Moreover, Google had been notified of the issue at the end of last year, twice, but had not apparently taken any action. The company tipping us — a competitor with those businesses — was concerned. Was this tactic acceptable or not?

The many examples I looked at certainly raised concerns. Articles were distributed across multiple news publications. The articles often contained several links that were “anchor rich,” meaning they appeared to have words within the links that someone hoped they might rank well for. Mechanisms for blocking these links from passing credit were not being used.

Google’s initial response to our questions about this was that it was aware there were issues and that it was looking to see how it might improve things.

That seemed a weak response to me. It was pretty clear from my conversations with two of the companies distributing the content, and one of the publishers, that there was, at the very least, confusion about what was acceptable and responsibilities all around.

Confusion about what’s allowed

Both the companies producing content professed that they felt they were doing nothing wrong. In particular, they never demanded that publishers carry any particular links, which seemed to them to put them on the right side of the guidelines. One also said that it was using canonical to block link credit but that the publishers themselves might be failing to implement that correctly. Both indicated that if they weren’t doing things correctly, they wanted to change to be in compliance.

In short: it’s not us to blame, it’s those publishers. And from the content I looked at on publisher sites, it was pretty clear that none of them seemed to be doing any policing of links. That was reinforced after I talked with one publisher, which told me that while it did make use of nofollow, it was reviewing things to be more “aggressive” about it now. My impression was that if nofollow was supposed to be used, no one had really been paying attention to that — nor was I seeing it in use.

In the end, I suggested to Google that the best way forward here might be for them to post fresh guidance on the topic. That way, Search Engine Land wasn’t being dragged into a potential spam reporting situation. More important, everyone across the web was getting an effective “reset” and reeducation on what’s allowed in this area.

Getting your house in order

Now that such a post has been done, companies distributing such content and publishers carrying it would be smart to follow the advice in it. When Google issues such advice, as it did about guest blogging in January 2014, that’s often followed by the search engine taking action against violators a few months later.

From a distributor point of view, I’d recommend thinking strongly about how Google ended today’s blog post:

If a link is a form of endorsement, and you’re the one creating most of the endorsements for your own site, is this putting forth the best impression of your site? Our best advice in relation to link building is to focus on improving your site’s content and everything–including links — will follow (no pun intended).

Bottom line: Deep down, you know if you were putting out this content primarily to gain links. If that was the case, you should work with those publishers to implement nofollow or canonical. If you can’t, then you should consider disavowing the links through Google’s disavow tool.

Going forward, I’d look to implement nofollow or canonical as Google recommends, if you find that the large-scale distribution is bringing you useful direct clicks and attention.

I will say that no one should take this to mean that you can never distribute content or that content can’t have any links at all that pass credit back to an originating site. Indeed, we have plenty of contributed content here on Search Engine Land. I’d be among the first screaming at Google if I thought it was trying to tell us or anyone that you couldn’t have such content unless you blocked all links.

Things that make us feel Google-safe are that, most of all, we publish original content from contributors. It’s not the same content that’s simply dumped into multiple publications. Also, we have editors who often spend a significant amount of time working with writers and content to ensure that it’s publication-worthy. And we do try to watch for links that we don’t feel are earned or necessary in a story.

We’re not perfect. No publisher will be. But I think from a publisher perspective, the more you are actually interacting with the content you publish to review and approve it, rather than blindly posting from a feed, the safer you will be. If you haven’t been doing that, then consider making use of nofollow and canonical on already-published content, as Google recommended.

As for those guest blogging requests

I’ll conclude with this part of Google’s post today:

Webmasters generally prefer not to receive aggressive or repeated “Post my article!” requests, and we encourage such cases to be reported to our spam report form.

Indeed. It’s amazing how many requests like this we’re getting each day, and I know we’re not alone. It’s even more amazing when this type of guest blogging was supposed to be over.

“Stick A Fork In It, Guest Blogging Is Done,” declared Matt Cutts in January 2014. Cutts, no longer at Google, was then the head of its web spam fighting team. His declaration was a shot heard around the web. Guest blogging almost became radioactive. No one seemed to want to touch it, much less send out idiotic bulk emails requesting a post.

Those requests are back in force. It’s a pity that so many come from Google’s own Gmail system, where all Google’s vaunted machine learning doesn’t catch them as the spam they are.

If you’ve been making such requests or accepting guest blog posts because of them, even in small scale, Google’s rules about policing links still apply.