Google is extending in-market audience targeting to Search campaigns

This article first appeared on SearchEngineLand and was written by Ginny Marvin (@ginnymarvin).

Advertisers will be able to target users based on purchase intent signals in Search campaigns for more than a dozen categories.

Google is continuing to extend its audience targeting capabilities into Search. The company announced Tuesday that In-market audiences, currently only available for Display Network and YouTube campaigns, will be coming to Search campaigns.

Google shared the news in a blog post released ahead of its annual live-streamed event, Google Marketing Next.

First introduced in 2013 under the name In-market buyers, the targeting is aimed at reaching consumers who are getting ready to make a purchase, based on an analysis of intent signals such as recent search queries and website browsing activity. From today’s blog post:

For example, if you’re a car dealership, you can increase your reach among users who have already searched for “SUVs with best gas mileage” and “spacious SUVs.”

There are currently more than a dozen In-market audiences available in AdWords to target users looking to buy things such as apparel, baby products, event tickets or real estate.

 

Along with similar audiences for Search and Shopping, the addition of these targeting options marks Google’s shift to tapping user search history for targeting in Search campaigns. It does so in an aggregated, anonymized way, but the company had long resisted incorporating that data in Search targeting for privacy reasons. Then Facebook came along, and advertiser expectations — and some would say consumer acceptance — of targeting capabilities changed with it.

It’s not clear what the timing will be on the rollout. It took roughly a year for similar audiences for Search and Shopping to roll out generally after Google first announced it at last year’s live-streamed event.

5 Reasons Why Your Business Needs to Start Making Vertical Video for Social Media

First published on SocialMedia Today by Andrew Macarthy.

Does your business record vertical videos for social media?

In years gone by, recording and uploading video with the camera held vertically was looked upon with ridicule; it produced big black bars on either side of the picture and a narrow viewing angle, guaranteed to turn viewers off.

But times are changing. 

In this post, I’m going to lay out five reasons why your business should be experimenting with vertical video for social media marketing in 2017, and the potential benefits it can bring.

1. People naturally hold their phones vertically 

Obvious, but important.

If we strip smartphones back to their most basic function – giving users the ability to make and receive phone calls – the design of modern smartphones simply follows the tradition of “dumb” phones from decades past: the device should be held vertically so that the user can speak and listen with minimal fuss. TV and cinema, meanwhile – the dominant visual media for so long – have demanded that the picture be viewed horizontally for the best experience. And so, despite all the things smartphones can now do, we’re historically conditioned to hold phones vertically and view video horizontally.

We’ve been stuck between two competing worlds, but times are changing.

For some hard facts, look to the MOVR Mobile Overview Report from December 2014, which found, unsurprisingly, that smartphone users hold their phones vertically about 94% of the time.


Image credit: Form Meets Function

With the huge increase in mobile-recorded video content online in recent years, it’s no surprise that vertical device usage is also on the up.

A 2016 study by KPCB Research showed that people in the US now spend 29% of their time using vertically held devices, up from just 5% in 2010. And since people are holding these devices vertically for most tasks, it makes sense that they’ll play video that way, too.

2. People access social media on mobile the most


Not only are the vast majority of mobile apps designed with the assumption that users will be interacting while holding their smartphone vertically, but it’s increasingly where people are spending their time on social media.

comScore’s 2016 U.S. Cross-Platform Future in Focus study showed that nearly 80% of social media use now occurs on mobile devices – 61% on smartphones alone.

In addition, by 2018 in the US, the gap between desktop and mobile internet use is predicted to grow, with people spending 3 hours and 20 minutes using the internet on their phones, compared to just 40 minutes on the computer.

As a business, your content needs to be built in a way that caters best to how it is being consumed.

3. Social networks are vertical video-friendly

If we look at how vertical video renders on social apps, we see something interesting.

As of February 2017:

  • Facebook – Vertical videos publish with no black borders
  • Instagram – Vertical videos publish with no black borders
  • Snapchat – Vertical videos publish with no black borders
  • Twitter – Vertical videos publish with no black borders
  • YouTube – The Android version of the app hides black borders when device is held vertically and video is viewed in full screen

Rather than looking upon vertical video as a negative, social networks – even YouTube – are embracing the format. There’s no way they’re going to be able to force people to film in landscape mode without really annoying them (as YouTube used to do), so why not make the viewing experience (inferior to landscape as it may be) as good as it can be?

And there are other benefits. When it comes to watching live video, viewers holding their phones vertically can engage with reactions and comments in a way that’s natural to them.

Rolling out the 2:3 ratio of vertical video on Facebook in August 2016 (rather than cropping vertical videos into squares), a Facebook representative told Marketing Land at the time that:

“We know that people enjoy more immersive experiences on Facebook, so we’re starting to display a larger portion of each vertical video in News Feed on mobile.”

Facebook wants people to stay on its platform, so it will do whatever it can to suit their needs.

Instagram’s roll-out of vertical video last year shared a similar sentiment.

In a blog post to announce its Stories feature (boasting over 150 million users as of January 2017), Instagram said:

“Square format has been and always will be part of who we are. That said, the visual story you’re trying to tell should always come first, and we want to make it simple and fun for you to share moments the way you want to.”

4. Vertical video ads convert better


Vertical video ads are growing in favor with advertisers as well – and when you understand that it’s the orientation in which users are increasingly consuming video, you see why.

“From a storytelling perspective, this is obviously more exciting,” Dan Grossman, VP of platform partnerships at VaynerMedia, told Mashable. “If we can take up more of the screen, that means you’re less distracted. We can capture more of the viewer’s attention.”

In another recent development – the launch of vertical video ads for Instagram Stories in January 2017 – Instagram seemed open to their success: 

“Since the beginning, we’ve been thoughtful about rolling out ads on Instagram to give businesses and consumers the best experience possible. And ad formats are no exception. Portrait has long been available on the platform for posts, and is a common format for consuming mobile content.”


For more evidence, look to Snapchat, who you might say spearheaded the vertical video revolution in social media.

In a pitch to publishers in 2015, Snapchat reported that full screen vertical video ad completion rates were 9x higher than those of horizontal video ads.

The company’s internal research also shows that vertical video ads draw up to 2x higher visual attention vs. comparable platforms.

“Communicate your brand message in a way that fits your phone, the way Snapchatters actually use it.”


In addition, Jason Stein, CEO of Laundry Service, reported success with LG vertical video ads soon after their launch last year, telling Adweek that the ads had been receiving CPM rates that were 3x more efficient than standard square videos on Facebook.

5. Your customers are lazy

Whisper it, but it’s true – and as customers ourselves, we’re all guilty of it.

Think of it like this: when users are zipping through mobile sites and social media feeds, they expect the experience to be seamless.

If your video plays in landscape and can be viewed okay, not many people are going to make the effort to turn their phone 90 degrees and tap to expand to full screen. It’s lazy, but it’s the truth.

As a marketer, that means you’re missing out on filling a user’s screen with your ad and keeping their full attention as effectively as possible.

Perhaps this point is best summed up by Zena Barakat, a former New York Times video producer who spent a year researching vertical video. She discovered that many people didn’t reorient their phones to watch horizontal videos in full-screen mode.

“As a person who makes videos, I was like, ‘You’re not seeing it the way we intended it!’ And they were like, ‘We don’t care!’ They found it so uncomfortable to hold the phone the other way, and they didn’t want to keep switching their phones back and forth.”

Over to you

What are your thoughts on the vertical video revolution? Will you be experimenting with vertical video on social for your brand?

Get to Know Fred and Modern Google SEO

This article first appeared in WebsiteMagazine by Peter Prestipino.

In late 2016, many in the search engine optimization (SEO) industry knew something big was coming. 

Right on cue, Google made a rather significant update to its search algorithm in early March 2017, code named “Fred,” and the resulting impact is now encouraging many in the digital marketing community to rethink not just their approach to the practice of optimization for search, but also the experience they develop in general.

Let’s take a closer look at the recent update (review a timeline of significant Google algorithm changes over the years at wsm.co/algotime) and get to know “Fred,” the current likes and dislikes of Google when it comes to the Web experience and some current best practices for consistently generating more organic traffic to websites.

Fred’s Focus

Google has acknowledged that it actually makes hundreds of updates each year to its core algorithm (it has been reported, in fact, that Google even released a few other updates at the same time Fred appeared), but its most recent has left many search marketers low on traffic and high on questions.

Fortunately, most updates of this magnitude (and pretty much all updates, for that matter) tend to focus on the same variables: either links or content.

In essence, if organic traffic was impacted, it was because the site was in some way violating the webmaster guidelines on quality. While there is no way to know for sure what the focus of Fred was, conversations with other search marketers and some research into search results suggest the update was content-related: the vast majority of the sites negatively impacted had two things in common – their content was “shallow” and their websites prioritized advertising over the experience of the user.

This is, of course, speculation (although informed speculation), but the impact has been quite substantial for sites that leverage a model or approach where advertising in its variety of forms encroaches on the digital experience of the user.

The Aftermath

Just how bad was the Fred update for these types of websites? Some SEOs and webmasters have actually reported 50-90 percent reductions in their organic traffic from Google. As one can imagine, that sort of drop in traffic is serious, but there are some things search marketers can do, and some things that they most certainly should not if they were the focus of this update.

The last thing companies want to do, for example, is panic, deleting pages without reason or modifying URL structures. While in the experience of many it is possible to get some traffic back over time, most of those that have employed tactics outside the guidance of Google’s Webmaster guidelines are more likely to simply abandon their sites than to put in the work required to fix their mistakes and get on the right track.

Necessary Steps

Should a website be one of those impacted, and should an enterprise be committed to regaining its rankings and resulting organic traffic, there are some steps that can be taken. If advertising is indeed the reason, consider how the digital property is monetized.

Are there simply too many ad units on the page? Too many popups still appearing for mobile users? A change related to how websites generate revenue may be in order. Should shallow content be to blame, there are also some corrective actions that can be taken, including identifying those pages which suffered a reduction in traffic, and including additional relevant content that is useful to users.

It seems so simple in theory – and it is. The tactical side of course is far more complex, but making these changes on a strategic level will be increasingly necessary if success is in the future plan.

 

How to Generate Content Ideas Using Screaming Frog in 20(ish) Minutes

by Todd McDonald and first published on Moz.com.

A steady rise in content-related marketing disciplines and an increasing connection between effective SEO and content has made the benefits of harnessing strategic content clearer than ever. However, success isn’t always easy. It’s often quite difficult, as I’m sure many of you know.

A number of challenges must be overcome for success to be realized from end-to-end, and finding quick ways to keep your content ideas fresh and relevant is invaluable. To help with this facet of developing strategic content, I’ve laid out a process below that shows how a few SEO tools and a little creativity can help you identify content ideas based on actual conversations your audience is having online.

What you’ll need

Screaming Frog: The first thing you’ll need is a copy of Screaming Frog (SF) and a license. Fortunately, it isn’t expensive (around $150 USD for a year), and there are a number of tutorials if you aren’t familiar with the program. After you’ve downloaded and set it up, you’re ready to get to work.

Google AdWords Account: Most of you will have access to an AdWords account due to actually running ads through it. If you aren’t active with the AdWords system, you can still create an account and use the tools for free, although the process has gotten more annoying over the years.

Excel/Google Drive (Sheets): Either one will do. You’ll need something to work with the data outside of SF.

Browser: We walk through the examples below utilizing Chrome.

The concept

One way to gather ideas for content is to aggregate data on what your target audience is talking about. There are a number of ways to do this, including utilizing search data, but it lags behind real-time social discussions, and the various tools we have at our disposal as SEOs rarely show the full picture without A LOT of monkey business. In some situations, determining intent can be tricky and require further digging and research. On the flipside, gathering information on social conversations isn’t necessarily that quick either (Twitter threads, Facebook discussion, etc.), and many tools that have been built to enhance this process are cost-prohibitive.

But what if you could efficiently uncover hundreds of specific topics, long-tail queries, questions, and more that your audience is talking about, and you could do it in around 20 minutes of focused work? That would be sweet, right? Well, it can be done by using SF to crawl discussions that your audience is having online in forums, on blogs, Q&A sites, and more.

Still here? Good, let’s do this.

The process

Step 1 – Identifying targets

The first thing you’ll need to do is identify locations where your ideal audience is discussing topics related to your industry. While you may already have a good sense of where these places are, expanding your list or identifying sites that match well with specific segments of your audience can be very valuable. In order to complete this task, I’ll utilize Google’s Display Planner. For the purposes of this article, I’ll walk through this process for a pretend content-driven site in the Home and Garden vertical.

Please note, searches within Google or other search engines can also be a helpful part of this process, especially if you’re familiar with advanced operators and can identify platforms with obvious signatures that sites in your vertical often use for community areas. WordPress and vBulletin are examples of that.

Google’s Display Planner

Before getting started, I want to note I won’t be going deep on how to use the Display Planner for the sake of time, and because there are a number of resources covering the topic. I highly suggest some background reading if you’re not familiar with it, or at least do some brief hands-on experimenting.

I’ll start by looking for options in Google’s Display Planner by entering keywords related to my website and the topics of interest to my audience. I’ll use the single word “gardening.” In the screenshot below, I’ve selected “individual targeting ideas” from the menu mid-page, and then “sites.” This allows me to see specific sites the system believes match well with my targeting parameters.

[Screenshot: site targeting results in Google’s Display Planner]

I’ll then select a top result to see a variety of information tied to the site, including demographics and main topics. Notice that I could refine my search results further by utilizing the filters on the left side of the screen under “Campaign Targeting.” For now, I’m happy with my results and won’t bother adjusting these.

Step 2 – Setting up Screaming Frog

Next, I’ll take the website URL and open it in Chrome.

Once on the site, I need to first confirm that there’s a portion of the site where discussion is taking place. Typically, you’ll be looking for forums, message boards, comment sections on articles or blog posts, etc. Essentially, any place where users are interacting can work, depending on your goals.

In this case, I’m in luck. My first target has a “Gardening Questions” section that’s essentially a message board.

[Screenshot: the site’s “Gardening Questions” section]

A quick look at a few of the thread names shows a variety of questions being asked and a good number of threads to work with. The specific parameters around this are up to you — just a simple judgment call.

Now for the fun part — time to fire up Screaming Frog!

I’ll utilize the “Custom Extraction” feature found here:

Configuration → Custom → Extraction

…within SF (more details and broader use-case documentation for this feature are available on Screaming Frog’s site). Utilizing Custom Extraction will allow me to grab specific text (or other elements) off of a set of pages.

Configuring extraction parameters

I’ll start by configuring the extraction parameters.

[Screenshot: Screaming Frog custom extraction settings]

In this shot I’ve opened the custom extraction settings and set the first extractor to XPath. I need multiple extractors set up, because multiple thread titles on the same URL need to be grabbed. You can simply cut and paste the code into the next extractors, but be sure to update the number sequence at the end of each to avoid grabbing the same information over and over.

Notice as well, I’ve set the extraction type to “extract text.” This is typically the cleanest way to grab the information needed, although experimentation with the other options may be required if you’re having trouble getting the data you need.

Tip: As you work on this, you might find you need to grab different parts of the HTML than what you thought. This process of getting things dialed can take some trial-and-error (more on this below).
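To make the numbered-extractor idea concrete, here’s a small Python sketch of the same scrape outside of SF; the forum markup and class names are hypothetical stand-ins for whatever the real site uses:

```python
import xml.etree.ElementTree as ET

# Hypothetical forum markup; real tag and class names will differ per site.
PAGE = """
<ul class="threads">
  <li class="thread-title">What is this plant?</li>
  <li class="thread-title">Best soil for raised beds?</li>
  <li class="thread-title">When to plant tomato seeds</li>
</ul>
"""

root = ET.fromstring(PAGE)
titles = [li.text for li in root.findall('.//li[@class="thread-title"]')]

# The approach above uses one numbered extractor per title, so the
# XPath pasted into SF differs only in its trailing index:
for i, title in enumerate(titles, start=1):
    print(f'(//li[@class="thread-title"])[{i}] -> {title}')
```

The trailing `[1]`, `[2]`, `[3]` is exactly the number sequence you increment when cloning the extractor in SF.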

Grabbing Xpath code

To grab the actual extraction code we need:

  1. Use Chrome
  2. Navigate to a URL with the content you want to capture
  3. Right-click on the text you’d like to grab and select “inspect” or “inspect element”

[Screenshot: Chrome’s Inspect view with the target text highlighted]

Make sure you see the text you want highlighted in the code view, then right-click and select Copy → “Copy XPath” (you can use other options, but I recommend reviewing the SF documentation mentioned above first).

[Screenshot: copying the XPath in Chrome DevTools]

It’s worth noting that many times, when you’re trying to grab the XPath for the text you want, you’ll actually need to select the HTML element one level above the text selected in the front-end view of the website (step three above).

At this point, it’s not a bad idea to run a very brief test crawl to make sure the desired information is being pulled. To do this:

  1. Start the crawler on the URL of the page where the XPath information was copied from
  2. Stop the crawler after about 10–15 seconds and navigate to the “custom” tab of SF, set the filter to “extraction” (or something different if you adjusted naming in some way), and look for data in the extractor fields (scroll right). If this is done right, I’ll see the text I wanted to grab next to one of the first URLs crawled. Bingo.

[Screenshot: extraction test results in Screaming Frog]

Resolving extraction issues & controlling the crawl

Everything looks good in my example, on the surface. What you’ll likely notice, however, is that there are other URLs listed without extraction text. This can happen when the code is slightly different on certain pages, or SF moves on to other site sections. I have a few options to resolve this issue:

  1. Crawl other batches of pages separately walking through this same process, but with adjusted XPath code taken from one of the other URLs.
  2. Switch to using regex or another option besides XPath to help broaden parameters and potentially capture the information I’m after on other pages.
  3. Ignore the pages altogether and exclude them from the crawl.

In this situation, I’m going to exclude the pages I can’t pull information from based on my current settings and lock SF into the content we want. This may be another point of experimentation, but it doesn’t take much experience for you to get a feel for the direction you’ll want to go if the problem arises.

In order to lock SF to URLs I would like data from, I’ll use the “include” and “exclude” options under the “configuration” menu item. I’ll start with include options.

[Screenshot: Screaming Frog include settings]

Here, I can configure SF to only crawl specific URLs on the site using regex. In this case, what’s needed is fairly simple — I just want to include anything in the /questions/ subfolder, which is where I originally found the content I want to scrape. One parameter is all that’s required, and it happens to match the example given within SF ☺.

The “excludes” are where things get slightly (but only slightly) trickier.

During the initial crawl, I took note of a number of URLs that SF was not extracting information from. In this instance, these pages are neatly tucked into various subfolders. This makes exclusion easy as long as I can find and appropriately define them.

[Screenshot: the subfolders to exclude from the crawl]

In order to cut these folders out, I’ll add a line to the exclude filter for each one.

Upon further testing, I discovered a few additional folders that needed to be excluded as well.

It’s worth noting that you don’t HAVE to work through this part of configuring SF to get the data you want. If SF is let loose, it will crawl everything within the start folder, which would also include the data I want. The refinements above are far more efficient from a crawl perspective and also lessen the chance I’ll be a pest to the site. It’s good to play nice.
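Taken together, the include and exclude filters amount to regex tests over each URL. A rough Python sketch, using /questions/ from the example above and purely hypothetical subfolder names for the excludes:

```python
import re

# Include anything under /questions/; exclude subfolders that don't
# carry thread titles. The excluded folder names here are hypothetical.
INCLUDE = re.compile(r'.*/questions/.*')
EXCLUDE = [re.compile(p) for p in (
    r'.*/questions/users/.*',
    r'.*/questions/tags/.*',
)]

def should_crawl(url: str) -> bool:
    """Mirror SF's filter logic: must match include, must match no exclude."""
    if not INCLUDE.match(url):
        return False
    return not any(p.match(url) for p in EXCLUDE)

print(should_crawl('https://example.com/questions/what-is-this-plant'))  # True
print(should_crawl('https://example.com/questions/users/jane'))          # False
print(should_crawl('https://example.com/blog/post'))                     # False
```

The same patterns paste straight into SF’s include and exclude boxes, which both accept regex.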

Completed crawl & extraction example

Here’s how things look now that I’ve got the crawl dialed:

[Screenshot: completed crawl with extraction data in Screaming Frog]

Now I’m 99.9% good to go! The last crawl configuration step is to reduce speed to avoid negatively impacting the website (or getting throttled). This can easily be done by going to Configuration → Speed and reducing the number of threads and the number of URIs crawled per second. I usually stick with something at or under 5 threads and 2 URIs per second.
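Outside of SF, the same throttling idea is just a minimum interval between requests. A rough Python sketch (the two-per-second figure mirrors the speed setting above):

```python
import time

class RateLimiter:
    """Caps the request rate, mirroring SF's Configuration → Speed setting."""

    def __init__(self, per_second: float):
        self.min_interval = 1.0 / per_second
        self._last = 0.0

    def wait(self):
        # Sleep just long enough that calls are at least min_interval apart.
        now = time.monotonic()
        sleep_for = self.min_interval - (now - self._last)
        if sleep_for > 0:
            time.sleep(sleep_for)
        self._last = time.monotonic()

limiter = RateLimiter(per_second=2)   # at most 2 requests per second
start = time.monotonic()
for _ in range(3):
    limiter.wait()   # a real crawler would fetch one URL here
elapsed = time.monotonic() - start
print(f"3 rate-limited requests took {elapsed:.1f}s")
```

It’s good to play nice: a delay like this keeps you from being a pest to the target site.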

Step 3 – Ideas for analyzing data

After the end goal is reached (run time, URIs crawled, etc.), it’s time to stop the crawl and move on to data analysis. There are a number of ways to start breaking apart the information grabbed, but for now I’ll walk through one approach with a couple of variations.

Identifying popular words and phrases

My objective is to help generate content ideas and identify words and phrases that my target audience is using in a social setting. To do that, I’ll use a couple of simple tools to break apart my information: Tagcrowd.com and Online-Utility.org.

Both perform text analysis, and some of you may already be familiar with the basic word-cloud generating abilities of tagcrowd.com. Online-Utility won’t pump out pretty visuals, but it provides a helpful breakout of common 2- to 8-word phrases, as well as occurrence counts on individual words. There are many tools that perform these functions; find the ones you like best if these don’t work!

I’ll start with Tagcrowd.com.

Utilizing Tagcrowd for analysis

The first thing I need to do is export a .csv of the data scraped from SF and combine all the extractor data columns into one. I can then remove blank rows, and after that scrub my data a little. Typically, I remove things like:

  • Punctuation
  • Extra spaces (the Excel “trim” function often works well)
  • Odd characters
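If you’d rather script the scrub than do it by hand in Excel, a minimal Python sketch of those cleanup steps might look like this (the sample rows are invented):

```python
import re

# Invented sample rows standing in for the exported extractor column.
raw_rows = [
    "  What is this plant???  ",
    "",                                            # blank row to drop
    "Best soil for raised beds\u00a0\u2013 help!", # odd characters
    "When   to plant    tomato seeds",             # extra spaces
]

def scrub(row: str) -> str:
    row = row.replace('\u00a0', ' ')         # odd characters -> plain spaces
    row = re.sub(r'[^\w\s]', '', row)        # strip punctuation
    row = re.sub(r'\s+', ' ', row).strip()   # collapse extra spaces (Excel "trim")
    return row

clean = [scrub(r) for r in raw_rows if r.strip()]  # also drops blank rows
print(clean)
```

The output is a clean, one-phrase-per-row list, ready to paste into any of the analysis tools.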

Now that I’ve got a clean data set free of extra characters and odd spaces, I’ll copy the column and paste it into a plain text editor to remove formatting. I often use the one online at editpad.org.

That leaves me with this:

[Screenshot: the cleaned list of thread titles in Editpad]

In Editpad, you can easily copy your clean data and paste it into the entry box on Tagcrowd. Once you’ve done that, hit visualize and you’re there.

Tagcrowd.com

[Screenshot: Tagcrowd word cloud of the extracted titles]

There are a few settings down below that can be edited in Tagcrowd, such as minimum word occurrence, similar word grouping, etc. I typically utilize a minimum word occurrence of 2, so that I have some level of frequency and cut out clutter, which I’ve used for this example. You may set a higher threshold depending on how many words you want to look at.
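If you’d rather reproduce Tagcrowd’s core behavior in code, a word count with a minimum-occurrence cutoff takes only a few lines of Python; the sample titles below are invented for illustration:

```python
from collections import Counter

# Invented sample thread titles standing in for the scraped data.
titles = [
    "what is this plant",
    "please help identify this plant",
    "identify these seeds please",
    "flowers not blooming",
]

# Count every word across all titles.
counts = Counter(word for t in titles for word in t.lower().split())

MIN_OCCURRENCE = 2   # same cutoff as the Tagcrowd setting above
frequent = {w: c for w, c in counts.items() if c >= MIN_OCCURRENCE}
print(frequent)
```

Raising `MIN_OCCURRENCE` cuts clutter the same way raising the threshold in Tagcrowd does.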

For my example, I’ve highlighted a few items in the cloud that are somewhat informational.

Clearly, there’s a fair amount of discussion around “flowers,” “seeds,” and the words “identify” and “ID.” While I have no doubt my gardening sample site is already discussing most of these major topics such as flowers, seeds, and trees, perhaps they haven’t realized how common questions around identification are. This one item could lead to a world of new content ideas.

In my example, I didn’t crawl my sample site very deeply, so my data was fairly limited. Deeper crawling will yield more interesting results, and you’ve likely already realized that in this example, crawling during various seasons could highlight topics and issues that are currently important to gardeners.

It’s also interesting that the word “please” shows up. Many would probably ignore this, but to me it’s likely a subtle signal about the communication style of the target market I’m dealing with. This is polite and friendly language that I’m willing to bet would not show up on message boards and forums in many other verticals ☺. Often, the greatest insights from this type of study, beyond identifying popular topics, relate to a better understanding of the communication style and phrasing your audience uses. All of this information can help you craft your strategy for connection, content, and outreach.

Utilizing Online-Utility.org for analysis

Since I’ve already scrubbed and prepared my data for Tagcrowd, I can paste it into the Online-Utility entry box and hit “process text.”

After doing this, we ended up with this output:

[Screenshot: Online-Utility common phrase counts]

[Screenshot: Online-Utility word occurrence counts]

There’s more information available, but for the sake of space, I’ve grabbed only a couple of shots to give you the idea of most of what you’ll see.

Notice in the first image, the phrases “identify this plant” & “what is this” both show up multiple times in the content I grabbed, further supporting the likelihood that content developed around plant identification is a good idea and something that seems to be in demand.
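Online-Utility’s 2- to 8-word phrase breakout is essentially an n-gram count, which is easy to sketch in Python as well (again with invented sample titles):

```python
from collections import Counter

# Invented sample thread titles standing in for the scraped data.
titles = [
    "identify this plant please",
    "can you identify this plant",
    "what is this and what is this seed",
]

def ngrams(words, n):
    """Yield every run of n consecutive words."""
    return zip(*(words[i:] for i in range(n)))

phrase_counts = Counter()
for t in titles:
    words = t.lower().split()
    for n in range(2, min(8, len(words)) + 1):   # 2- to 8-word phrases
        for gram in ngrams(words, n):
            phrase_counts[' '.join(gram)] += 1

# Phrases that occur more than once are content-idea candidates.
repeated = {p: c for p, c in phrase_counts.items() if c > 1}
print(repeated)
```

Phrases like “identify this plant” surfacing more than once is exactly the kind of demand signal described above.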

Utilizing Excel for analysis

Let’s take a quick look at one other method for analyzing my data.

One of the simplest ways to digest the information is in Excel. After scrubbing the data and combining it into one column, a simple A→Z sort, puts the information in a format that helps bring patterns to light.

[Screenshot: extracted questions sorted A→Z in Excel]

Here, I can see a list of specific questions ripe for content development! This type of information, combined with data from tools such as keywordtool.io, can help identify and capture long-tail search traffic and topics of interest that would otherwise be hidden.

Tip: Extracting information this way sets you up for very simple promotion opportunities. If you build great content that answers one of these questions, go share it back at the site you crawled! There’s nothing spammy about providing a good answer with a link to more information if the content you’ve developed is truly an asset.

It’s also worth noting that since this site was discovered through the Display Planner, I already have demographic information on the folks who are likely posting these questions. I could also do more research on who is interested in this brand (and likely posting this type of content) utilizing the powerful ad tools at Facebook.

This information allows me to quickly connect demographics with content ideas and keywords.

While intent has proven to be very powerful and will sometimes outweigh misaligned messaging, it’s always great to know as much about who you’re talking to and be able to cater messaging to them.

Wrapping it up

This is just the beginning and it’s important to understand that.

The real power of this process lies in its use of simple, affordable tools to gain information efficiently, making it accessible to many on your team and an easy sell to those who hold the purse strings, no matter your organization’s size. The process is affordable for mid-size and small businesses, and at the enterprise level it is far less likely to leave teams waiting on larger tool purchases.

What information is gathered and how it is analyzed can vary wildly, even within my stated objective of generating content ideas. All of it can be right. The variations on this method are numerous and allow for creative problem solvers and thinkers to easily gather data that can bring them great insight into their audiences’ wants, needs, psychographics, demographics, and more.

Be creative and happy crawling!

5 Ways for Job-Seeking Millennials to Clean Up Their Social Media Profiles Today

by Christie Garton and first published on Recruiter.com

Graduation has come and gone. If you’re like so many young people today who were unable to secure professional employment in the field of their choice before leaving college, you’re likely still hunting for those ideal job postings, submitting applications, and going on as many interviews as possible.

Resume in order? Check. Networking events attended? Check. Social media accounts cleaned up? Hmm.

If you haven’t done so already, you might want to seriously rethink what you’ve put out into the social media universe as well. This, believe it or not, is a critical part of the job search.

A recent survey conducted by my nonprofit, the 1,000 Dreams Fund, via Toluna Quicksurveys found that half of job seekers polled between the ages of 18 and 25 don’t plan to clean up their social media profiles before applying for jobs. This is a big mistake, especially given that employers say they use social media to screen and possibly eliminate candidates, according to another recent survey.

The bottom line is this: Don’t let some social media goof overpower your stellar application and prevent you from becoming the next promising employee at the company of your dreams!

Here are five tried-and-true tips from other successful grads about cleaning up your social media profile during the all-important job hunt!

1. Google Yourself

Search yourself to see what comes up. Be sure to dig deep and see what each page contains. What you see may surprise you – and it’s the quickest way for you to gauge what employers are seeing.

2. Keep It Private!

Depending on what you find during your Google search, it may be a good idea to make your Facebook profile private so that only those in your network of friends can see all the fun you had in school.

3. Delete, Delete, Delete!

Your employer can access pretty much anything online. If you wouldn’t want them to see a specific post, tweet, or picture, delete it. If you find something on a third-party site you don’t want out there, reach out to the publisher or editor to see if they’ll remove the post. In most cases, they will, especially if you are clear that it could impact your ability to find a job.

4. Keep it PG

Getting ready to post an update, or maybe a pic from that girls’ night out? If it’s something you wouldn’t want your teenage cousin or grandmother to see, you should probably reconsider! At the end of the day, there’s no way to gauge who is looking at your pictures or posts, so you should be sure to avoid posting anything controversial.

5. Leave It to the Pros

Cleaning up your social media presence can be a time-consuming process, so it’s important to know that there are professional “scrubbing” services you can lean on. These services are especially useful when you’re dealing with something that’s hard to remove, because they pride themselves on cleaning up messy digital footprints.


Christie Garton is an award-winning social entrepreneur, author, and creator of the 1,000 Dreams Fund.

Google: Short Articles Won’t Penalize Your Site; Think About Users

by Barry Schwartz and first published on Search Engine Roundtable.

Google’s John Mueller covered lots and lots of myths this past Friday in the Google Hangout on Google+. He said at the 34:37 minute mark that having short articles won’t give you a Google penalty. He also said that even some long articles can be confusing for users. He said that short articles can be great and long articles can be great – it is about your users, not search engines.

The question posed was:

My SEO agency told me that the longer the article I write, the more engaged the user should be or the Google will penalize me for this. I fear writing longer articles with lots of rich media inside because of this, is my SEO agency correct or not?

Back in 2012, Google said short articles can rank well and then again in 2014 said short articles are not low quality. John said in 2016:

So I really wouldn’t focus so much on the length of your article but rather making sure that you’re actually providing something useful and compelling for the user. And sometimes that means a short article is fine, sometimes that means a long article with lots of information is fine. So that’s something that you essentially need to work out between you and your users.

From our point of view, we don’t have an algorithm that counts words on your page and says, oh, everything under a hundred words is bad, everything between 200 and 500 is fine, and over 500 needs to have five pictures. We don’t look at it like that.

We try to look at the pages overall and make sure that this is really a compelling and relevant search result for users. And if that’s the case, then that’s perfectly fine. Whether that’s long or short, or lots of images or not, that’s essentially up to you.

Sometimes I think long articles can be a bit long-winded and might lose people along the way. But sometimes it’s really important to have a long article with all of the detailed information there. That’s really something that maybe it’s worth double-checking with your users, doing some A/B testing with them, maybe getting their feedback in other ways. Sometimes you can put stars on the page to collect reviews, or maybe use Google Consumer Surveys to get a quick sample of how your users are reacting to that content. But that’s really something between you and your users, and not between you and the Google search engine, from that point of view.

I specifically took the Google Consumer Surveys approach when I was hit by the Panda 4.1 update, which I recovered from with Panda 4.2. I even published my results for all to see over here, and it showed that people, my readers, like my short content.

So it really isn’t about how short, tall, long or detailed you are. As long as the content satisfies the user, Google should be satisfied too.

Here is the video embed:

Google Search Analytics Adds Compare Queries Option

First published on Search Engine Roundtable.

The Search Analytics feature in Google Search Console has added an option that lets you compare two different queries. You can access it by going to Search Console, clicking on Search Analytics, then clicking on the Queries option and selecting “compare queries.”

This was first spotted by Jonathan Jones and posted on Twitter.

This basically lets you compare two search phrases against each other.
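For those who prefer scripting over the UI, the same side-by-side numbers can be pulled via the Search Analytics API by filtering on the query dimension. A hedged sketch that only builds the request bodies (the dates and query terms are placeholders; each body would then be passed to `searchanalytics().query()` with your verified site URL):

```python
# Build one Search Analytics API request body per query, over the same
# date range, so the two result sets can be compared side by side.
def compare_request(query, start="2016-04-01", end="2016-04-30"):
    return {
        "startDate": start,
        "endDate": end,
        "dimensions": ["query"],
        "dimensionFilterGroups": [{
            "filters": [{
                "dimension": "query",
                "operator": "equals",
                "expression": query,
            }]
        }],
    }

requests = [compare_request(q) for q in ("blue widgets", "red widgets")]
print(requests[0]["dimensionFilterGroups"][0]["filters"][0]["expression"])
```

Running both requests and lining up the clicks, impressions, CTR, and position columns reproduces what the compare-queries view shows in the console.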

Here is where you find the option:

Then this box pops up:

Then you enter the two phrases you want to compare:

Here is then the output that shows you those two queries:


 

Google penalizes sites for unnatural outbound linking

By Barry Schwartz and first published on Search Engine Land.


Check your Google Search Console message center. Google sent out outbound link penalties over the weekend.

Over the weekend (April 9-10, 2016), Google issued many manual actions for “unnatural outbound links.” This is a penalty issued by the Google manual actions team, specifically over sites linking out to other sites in an effort to manipulate the Google search results. In this case, it seems Google penalized the site by deciding not to trust any of the links on the website.

The email sent to these webmasters read:

If you see this message on the Manual Actions page, it means that Google has detected a pattern of unnatural, artificial, deceptive, or manipulative outbound links. Buying links or participating in link schemes in order to manipulate PageRank is a violation of Google’s Webmaster Guidelines.

As a result, Google has applied a manual spam action to the affected portions of your site. Actions that affect your whole site are listed under Site-wide matches. Actions that affect only part of your site and/or some incoming links to your site are listed under Partial matches.

Here is a picture from one of the many complaints about this manual action in the Google support forums:


 

You should log in to your Google Search Console account and check your All Messages box to see if you have this notification or any others. If you were hit by the outbound link penalties, there are instructions on how to fix them over here.

I have only seen a few inbound link notification penalties this weekend, so it is too early to tell whether sites on the other end of this penalty were hit as well. If anything, it seems Google is no longer trusting the links from these sites, which can have a negative ranking impact on the sites receiving those links. But that would not be a direct penalty for inbound links.

Google has not commented about this penalty.

UPDATE 4-12-16: Source: Search Engine Land

The mass Google manual actions for outbound links were related to the warning Google gave a few weeks ago about bloggers giving links in exchange for free products or services.

Yesterday, Search Engine Land reported that Google sent out outbound linking penalties to a mass number of webmasters over the weekend. It turned out that this was directly related to the warning from Google a few weeks ago telling bloggers to disclose free product reviews as such and to nofollow the links in those posts.

Google told bloggers to “nofollow the link, if you decide to link to the company’s site, the company’s social media accounts, an online merchant’s page that sells the product, a review service’s page featuring reviews of the product or the company’s mobile app in an app store.”

Well, a few weeks after that warning was communicated, in typical Google style, Google actually sent out manual actions to those who did not comply with the guidelines.

John Mueller from Google commented in a few threads in the Google support forums telling people to look at the warning Google published a few weeks ago named Best practices for bloggers reviewing free products they receive from companies. He added:

In particular, if a post was made because of a free product (or free service, or just paid, etc.), then any links placed there because of that need to have a rel=nofollow attached to them. This includes links to the product itself, any sales pages (such as on Amazon), affiliate links, social media profiles, etc. that are associated with that post. Additionally, I imagine your readers would also appreciate it if those posts were labeled appropriately. It’s fine to keep these kinds of posts up, sometimes there’s a lot of useful information in them! However, the links in those posts specifically need to be modified so that they don’t pass PageRank (by using the rel=nofollow).

Once these links are cleaned up appropriately, feel free to submit a reconsideration request, so that the webspam team can double-check and remove the manual action.
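Before filing a reconsideration request, it helps to audit which links in a review post still pass PageRank. A minimal sketch using Python's standard-library HTML parser, with a hypothetical post snippet; the URLs are placeholders:

```python
from html.parser import HTMLParser

class FollowedLinkFinder(HTMLParser):
    """Collect hrefs of <a> tags that lack rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.followed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower().split()
        href = attrs.get("href", "")
        if href and "nofollow" not in rel:
            self.followed.append(href)

# Hypothetical review-post snippet: one followed link, one nofollowed.
post = ('<p>Loved it! <a href="https://example.com/product">Buy here</a> '
        'or see <a rel="nofollow" href="https://example.com/shop">the shop</a>.</p>')

finder = FollowedLinkFinder()
finder.feed(post)
print(finder.followed)  # -> ['https://example.com/product']
```

Any URL this flags in a sponsored or free-product post is one Mueller's guidance says should carry rel=nofollow before you resubmit.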

Google says “orchestration” is next big trend in the Internet of Things

By Jennifer Elias, Technology Reporter, Silicon Valley Business Journal – First published in the San Antonio Business Journal.

The Internet of Things has officially advanced past the “hype” stage, according to conference panelists including a Google developer.

At the Bluetooth World conference in Santa Clara, California, Wayne Piekarski, a senior developer advocate for Google, said he believes the next big push in the Internet of Things is “orchestration,” meaning multiple connected devices aware of each other working together.
Google says orchestration the next big trend in Internet of Things.

Photo Credit: Tony Avelar/Bloomberg

“When you walk in your home, the lights come on and coffee machine goes on,” Piekarski said. “People don’t want to control a single light bulb, they’re going to work with multiple devices, which means working with multiple manufacturers.”

Currently, users of connected devices have to buy one product and download its app. But Piekarski said the Mountain View company wants to decouple that “so that some app developer can write the app but the device doesn’t have to be made by the same company.”

According to Piekarski, that’s why the company is working with smart home platforms such as Brillo and Weave.

Google product strategy manager Scott Jenson spoke in a separate panel about Google’s efforts toward “The Physical Web,” an open-source effort to enable all connected devices to work in unison without the need for separate apps. Jenson said it is one of the company’s most popular projects on the open-source collaboration platform GitHub.
As demand increases for connected devices, consumers will want to add more features such as voice commands, he added. “There’s all these integrations you can do once people get a feel for the Internet of Things.”

Sales of connected devices are climbing but Piekarski said consumers should be wary about some of the products on the market. “There are IoT devices gradually being put out there but they’re not done well,” he said, adding “I think what we’re going to see is a security conflict.”

Piekarski said he expects a security breach before IoT manufacturers start taking security seriously. “Currently, they’re [manufacturers] sending packets over the network un-encrypted and we haven’t had anything bad happen yet but something’s going to happen where someone’s house catches on fire,” he warned.
“People need to trust them and if you’re going to put it [connected device] on a door, it has to work every time and it has to let you into the house every time–it can’t lock you out. The challenge is going to be doing it really well and that’s what we’re trying to do right now–to make it so that people can trust these things.”

Adnan Nishat, another panelist and senior product manager at Silicon Labs, agreed with Piekarski, saying when a high-profile security attack happens, it doesn’t just affect one brand “but it weakens consumer trust which impacts everyone … We need to take a systematic approach to implementing security, not just about turning a feature on and off, but planning holistically end-to-end.”

A Design Workflow Tutorial for Developers: Deliver Better UI/UX On Time

BY LUBOS VOLKOV – LEAD PRODUCT DESIGNER @ TOPTAL

A great designer or design team can be an invaluable asset to any project. With clear communication channels and free-flowing cooperation, the designer should give you everything you need to speed up the building process and limit questions and confusion as much as possible.

What can you, the UX developer, do to ensure that the product you have built is delivered in a timely manner without sacrificing the quality of the user interface and user experience?

Call up your designer; it’s time to streamline UI design workflow.

My answer: Get your designers involved from day one, and keep them involved throughout the entire UI/UX development process. Make sure to establish clear communication lines and consistent messaging between developers and the designers.

Do You Have Everything You Need?

The worst thing that can happen during the implementation of any UI is a lack of communication between the designer and the developer (unless they’re the same person). Some designers think their job is done once the PSD is sent over. But that’s just wrong! You must create an always-on communication workflow that lasts beyond the delivery of the PSDs.

Projects where the designer just submits the design files and the developer just implements them are the projects that fail.

In many cases, it will take time before the designers see the actual UI/UX design implementation and, to their surprise, the build is completely different from the initial submission. (This has happened to me more than once. I have sent over source files with complete descriptions and interaction prototypes, but when I finally saw the project, months later, it had a different layout, different colors, and no interactions in place.)

Some designers might hate me for this, as this design workflow requires a lot of “extra” work on their side. However, creating and delivering full assets and information, in an organized way, is better for the project and the team as a whole.

Having everything a developer needs in front of them will speed up the process. A clean PSD is just not enough.

What do you need to get the job done effectively and efficiently?

These are the assets that a developer should expect from the designer to bring a UI/UX design to implementation:

  • Resource file – The designer should place every element of the app in one file: buttons, checkboxes, header styles, fonts, colors, etc. Based on the information in this file, a developer should be able to recreate any interface from scratch. It’s much easier for a developer to export an element from a single PSD than to search multiple files for it.
  • Assets – Make sure that developers get all the required assets, so the source files don’t need to be touched any more.
  • Interaction prototypes – The days of static screens are long gone. Using smart interactions and animations to smooth out the UX design workflow and implementation is common practice now. But you can’t just tell a developer “this will slide in from the left.” The designer should create an actual prototype of that interaction, and is expected to specify values such as speed, velocity, etc.
  • Naming convention – Request a file naming structure to keep things organized. It’ll make it easier for both of you to navigate the files. (No one likes to have things hidden in a background folder.)
  • HDPI resources – We live in hard times, with a huge range of screen densities. Make sure the designer delivers images in all of the required resolutions so your application will look crisp everywhere. Note: use vectors (SVG) as much as possible; it will help you a lot.

If you do find something else missing during the implementation, don’t be afraid, ping the designer and ask for it. Never skip, and never skimp! You are members of the same team, and your job is to deliver the best possible product. If a designer fails, you fail as well.

Work In-Progress

Utilize your designers during the UI/UX development process. Don’t keep them on the sidelines expecting them to just “push the pixels.” A designer sees possible innovations even before the implementation starts. To take advantage of this, keep them in the loop. Provide them with access to see, and test, the work in progress. I’m well aware that no one likes to share unfinished projects, but it is much easier to make changes in the middle of a build than at the end. Doing so may save you time and prevent unnecessary work. Once you give the designer a chance to test the project, ask them to compile a list of problems, solutions, and suggested improvements.

What should you do when a developer has an idea that would change the look of an application? Discuss it with the designer, and never allow a developer to modify the design without consulting the designer. This design workflow will ensure that the build stays on track. A great designer has a reason for every element on the screen. Taking a single piece out without understanding why it’s there could ruin the user experience of the product.

UI/UX Design Project Management

Designers think that developers can bring a design to life in one day, or even in one hour. But, like great design, great development takes time and effort. Keep your anxious designer at bay by letting him see the progress of the build. Using external project management software, to make sure every revision is accounted for, is a great way to make sure you don’t miss important information discussed in an email conversation or a Skype session. And let’s be honest: sometimes changes and activities aren’t even communicated until they happen.

Whatever solution you use, be sure to choose one workflow process that the whole team will adopt and consistently use. On our team, I tried to push Basecamp because that’s what I was using, but our front-end developers thought it had limited features. They were already using other project management software, such as JIRA, GitHub, and even Evernote, to track bugs, progress, etc. I understood that project tracking and management should be kept as simple as possible, so I migrated my UI design workflow to JIRA. I wanted to make sure they understood my workflow and progress, but I did not want them to feel like design was another thing to manage.

Here are few suggestions for a project management tool:

  • Basecamp – Tracks the progress of the design and development related tasks, and easily lets you export tasks. It also has a simple mobile client.
  • JIRA – A fully customizable platform where you can easily set up custom boards for different areas. For example, organize boards to track activities such as back-end, front-end, design, etc. I think the mobile client is a bit weak, but it is a great solution for bigger teams and includes a bug tracking feature.
  • Email – This is great for setting up a conversation or sending images. But please be careful if you use email for feedback; things can easily get lost.

You can also try Trello and other project management software, but the most widely used in our industry are Basecamp and JIRA. Again, the most important thing is to find a project management system that everyone can use on a consistent basis, otherwise it’s a moot point.

Lubos Volkov is an experienced designer who has worked remotely with numerous developers throughout his career. As the product designer at Toptal, Lubos interacts daily with team members from a variety of departments including engineering, community, and content. He is a talented designer whose communication skills contribute to his success. In this tutorial, Lubos shares his experiences and ways to optimize designer-developer UI and UX workflows that lead to quality products delivered on, or before, deadline.