Facebook is now the most popular place for advertisers to run their video ads, even beating YouTube.

By MediaStreet Staff Writers

Top marketers know that digital video is one of the most powerful tools to increase consumer engagement and brand loyalty. In fact, according to a new study from Clinch, brand marketers are ramping up their production of digital videos with an emphasis on creating campaigns specifically for Facebook and YouTube.

The study found that 78 percent of marketers plan to increase their production of video ads in 2018, while only 43 percent of marketers plan to increase their production of static banner ads this year.

Social is Video

When it comes to digital video campaigns, Facebook reigns supreme, representing 46 percent of all video ads produced. When adding Facebook-owned Instagram into the mix, this number leaps to 74 percent. YouTube comes in a close second at 41 percent.

Says Oz Etzioni, CEO of Clinch, “It’s no secret that Facebook and YouTube dominate the digital media landscape and we don’t expect this to slow down, particularly with the Facebook algorithm change which requires brands to pay in order to be seen. In 2018 brands will increase spend and leverage the rich data that these platforms provide. However, the data and platform are just two pieces of the puzzle. Creative is the critical third piece. If brands aren’t uniquely tailoring their creative specifically for each platform and by audience, opportunities will be missed and ROI will be lowered.”

Nearly three-quarters of marketers adapt their TV commercials into online video, and 44 percent indicated that they don’t shorten those commercials to each platform’s suggested length. While TV ads remain a critical source of video content, the user experience of each social platform is very different from traditional TV. For example, TV ads run 15 to 30 seconds, but Facebook and YouTube recommend six-second videos.

Etzioni continued, “We were really surprised to learn that marketers were taking a one-size-fits-all approach to video. In 2018, marketers will awaken to the fact that investment in creative increases ROI, and personalisation at scale will become the norm for digital video as it has for static ads.”

Defining Social Personalisation

While 50 percent of respondents say they personalise their video campaigns, brands could be doing a lot more. Those that are personalising their creative based on data are seeing big results. Nearly 90 percent of respondents who have customised Facebook or YouTube video ads reported seeing benefits. Furthermore, 70 percent of those who customise said that they have seen improvements in their key performance indicators (KPIs).

According to Etzioni, in the next few months, the definition of personalisation will change. “Rather than creating a handful of versions – one for men, one for women, one for the East Coast and one for the West Coast – we expect brands to be using data insights to personalise at scale. This means hundreds if not thousands of versions of videos where the message and creative are tailored to consumers’ specific needs and interests. This will create a more meaningful experience for the consumer and transform video campaigns from simply brand awareness to direct response opportunities.”

The full report, “How Leading Brand Marketers are Using Personalised Video to Drive Sales,” is available for download here.


By Tylor Hermanson

Are keywords still important for search engine optimization (SEO)?

Do keywords even matter to Google anymore?

The short answer: Absolutely.

The longer answer: Keep reading.


What are SEO Keywords?

SEO keywords range from singular words to complex phrases and are used in website copy to attract relevant, organic search traffic. However, keyword integration is just the start. When properly leveraged, targeted SEO keywords should be used to inspire all page content in order to satisfy searcher intent.

From a searcher’s perspective, keywords are the terms typed or spoken into a search engine. When effectively researched and optimized, keywords act as a conduit for your target audience to find the most appropriate content on your website.

[Image: Google search query for “what are keywords in seo”]

But Aren’t Keywords Obsolete?

Whether you’ve heard this a few times already or your first is yet to come, “Keywords are dead” is a phrase which continues to barge its way into SEO circles. Rather than tip-toe around this recurring, binary, often-click-bait motivated assertion, let’s confront it head on.

Several developments in the SEO world have caused this claim to be stirred from hibernation, but there are four major ones that come to mind.

1. “(not provided)”

If you’re brand new to SEO, you may be surprised to know organic keywords were once easily accessible in Google Analytics, Adobe Omniture, or any other analytics platform.

I’m not going to lie; it was pretty fantastic. We didn’t know how good we had it at the time.

However, things started changing in 2010, when Google began quietly taking steps to remove keyword data from our web analytics. From late 2011 through the following year, keyword data was removed in a big way. Before long, the top keyword driver for every site was ‘(not provided)’.

[Image: “(not provided)” entry in a Google Analytics keyword report]

Once we lost our keyword data and were seemingly flying blind, many were quick to write the obituary for keywords.

But what really was different? After all, people were still searching the same way, and Google hadn’t changed how it interpreted our content. We just had less visibility.

We’ve all heard, “If a tree falls in a forest and no one is around to hear it, does it make a sound?” This is the same thing. Nothing was different; we just weren’t around.

Bottom line: Keywords aren’t dead. The old way of tracking them is.

2. Hummingbird & RankBrain

Another time the validity of keywords was challenged was when Google rebuilt its algorithm in 2013. Receiving its name for being fast and precise, Hummingbird helped Google better understand search intent, particularly with complex and conversational searches.

In 2015, Google incorporated the AI-driven ranking factor, RankBrain, into the mix to further improve its query interpretation abilities.

Before, a search for “what pizza places near me deliver?” would send Google off looking for content that matches those terms. Now, Google uses these keywords as contextual signals to learn what we really want and often rewrites our query behind the scenes (e.g., “pizza delivery 66062”).

Knowing Google often rewrites our search queries may make it seem like their usefulness is all but obsolete. But really, Google just got smarter with what we provided.

Here’s another perspective. Have you ever heard the statistic that only 7 percent of communication happens through words alone? It was derived from a popular study in the late 1960s and is often used to boost the stature of nonverbal communication, diminishing that which is verbal.

Here’s a challenge for you:

Go through your entire day tomorrow without using words – no typing, saying, or signing them. At the end of the day, let me know if you felt your communication was 93 percent as effective as it normally is.

I think you can probably predict the outcome.

It’s not that the stat is wrong. There is so much more to communication (and search) than words. It is, however, often misunderstood.

The 7 percent speaks more to quantity than importance. We need that 7 percent, and we need keywords.

Bottom line: Keywords aren’t dead. Google’s former way of interpreting them is.

3. Voice Search

I love voice search. Even though it’s been around for years, I still feel like I’m in the future when Google magically captures my unintelligible stammering.

As voice search grew from being an occasionally-used novelty to a staple in our search behavior, many wondered what that meant for keywords. We all knew voice search impacted keywords, but did it kill them?

We’ve Become Long-Winded

Between us (subconsciously) picking up on Google’s heightened interpretation skills and our communication tendencies when talking versus typing, we have become very conversational and detailed searchers.

In the old days, if we wanted to know who Brad Pitt’s first wife was, we would translate our thoughts into a search-friendly query, like “Brad Pitt’s wives”. Now, we simply tell Google what we want: “Who was Brad Pitt’s first wife?”. This is one of the main reasons why 15 percent of the searches Google sees each day have never been searched before.

So, while it’s been a huge win for searchers, it’s posed challenges to SEO professionals. For instance, it’s hard to know which keywords to keep an eye on if a significant chunk of traffic is driven by those that had rarely, if ever, been searched before.

But this goes back to the “(not provided)” argument. Just because our tracking is imperfect doesn’t mean the significance of keywords lessens in any way.

We Omit Important Keywords

Did you know through voice search you can find out when Scarlett Johansson’s first album was released from a query that doesn’t include her name or the name of her album? (Side note: Did you know Scarlett Johansson had an album?)

Google understands context matters, not only within a search, but between strings of them as well.

So, do keywords actually matter if you can leave out crucial bits and still get what you want? Of course! This just forces us to step back and look at the bigger picture, rather than examine each individual search in a vacuum.

Bottom line: Keywords aren’t dead. Typing as our only way to search them is.

4. Google Keyword Planner Grouped Volumes

Starting in 2014 and kicking things up a notch two years later, Google’s Keyword Planner tool began grouping volumes for similar terms. Instead of showing that keyword A gets searched 100 times per month and keyword A1 gets searched 50 times per month, both would show 150. Google said the reason for this was to make sure “you don’t miss out on potential customers” and to “maximize the potential for your ads to show on relevant searches.”

That explanation certainly implies searcher intent doesn’t vary much between closely related terms.

The move seemed to reinforce the notion that topics, not keywords, are all SEO professionals need to worry about. However, this doesn’t explain why Google search will often significantly shake up its results for keywords that Google Keyword Planner deems synonymous enough to lump together.

Ultimately, Keyword Planner is a PPC tool. You don’t have to be a conspiracy theorist to understand how forcing PPC bidders to expand their keyword targeting could be a financially-motivated decision.

Bottom line: Keywords aren’t dead. But Google’s keyword metrics might as well be.

Why are Keywords so Important to SEO?

We know keywords are alive and well, but why are they so critical to SEO?

Keywords are Clues

The importance of keywords in SEO is in part due to their importance outside of it.

Forget about keywords, rankings, traffic, or even your website for a minute.

If you knew your customers’ true feelings, how would you operate your business differently? How valuable would those insights be to you?

In his book “Everybody Lies”, Seth Stephens-Davidowitz shares what search behavior reveals about human psychology. When we’re in a focus group, taking a survey, or responding to something on Twitter, we all tend to let our answers be shaped by how others may perceive them.

What about when we’re searching? The combination of anonymity and immediate access to a wealth of information paves the way for an unadulterated look into what we truly want.

It’s data-driven truth serum.

At its core, keyword research is a powerful market research tool that can be leveraged in many different ways, not just informing website content. To get the most out of keywords, you have to look beyond the explicit, literal translation and also pick up on the implicit clues to gain the true intent of each keyword.

As an example, let’s look at the query “safest baby cribs 2017”.

[Image: Google search results for “safest baby cribs 2017”]

Explicit information:

  • concerned about safety
  • wants more than one crib to choose from
  • looking for an article published in 2017

Implicit information:

  • likely first-time parents
  • wants to know what makes cribs safe/unsafe
  • understands safety standards change over time
  • in research phase with future intent to buy
  • possibly in process of buying other items for the nursery
  • safety may be more important than cost or aesthetics
  • likely looking for a list of cribs ranked by safety

Keywords are Like Personas

Personas act as bullseyes. They aren’t all we’re after but by aiming for them, we’re setting ourselves up for success.


It’s not as if I only want to market to 54-year-old women named Betty who have a 401k and are soon to be empty nesters. But that level of granularity and focus helps ensure I’m attracting the right group of people.

Conversely, if you have no focus and try to appeal to everyone, you will likely come away empty-handed. It’s a beautiful paradox, really: the more exclusive your target audience, the larger the audience you actually reach, and vice versa.

[Image: humorous SEO persona illustration]

It’s the same with keywords. A quick peek into Google Search Console’s search query data will tell you it’s never just about one keyword. However, having a primary keyword target for each page will give you the right direction and perspective to capture the right audience from a plethora of related searches.

How do You Choose the Right Keywords?

This topic could fill a post of its own – and it has, many, many times.

While I highly suggest researching and experimenting with this topic in great detail if you’re serious about honing your craft, here’s a quick introduction to selecting the best keywords for SEO.

  • Don’t start with keywords: Before you put on your SEO hat or even your marketing hat, just be human. Learn about your customers from your customers. Before diving into tools and spreadsheets, try to gain some real empathy and understanding for the customers you’re serving and the perspectives they hold.
  • Build a seed list: Using what you gained in step one, along with what you know about where your customers’ needs and your business’ solutions intersect, brainstorm an initial list of words and phrases that effectively describe your core offerings.
  • Gather current keyword data (if your site already exists): Generate a list of what is currently (and nearly) driving traffic to your site using Google Search Console click data and any ranking data you have.
  • Expand the list using various keyword tools: Expand on the list you’ve built from steps 1-3 by looking for new keyword groups, alternate phrases, common modifiers and long-tail permutations. If you haven’t used many keyword research tools up to this point, now’s your time.
  • Group terms by search intent: Categorize your keywords in a way that will be simple and useful for you and anyone else who might look through them. This can be done by audience type, phase of the funnel, or any other way that makes sense to you (a toy sketch of steps 4 and 5 follows this list).
  • Map keywords to content: Choose 1-4 primary keywords to target on each page based on a careful balance between keyword difficulty, relevance, and search volume (taking organic SERP visibility into account). Once those are determined, find semantically-related and long-tail modifying terms to help support your primary keywords.
  • Do it all over again: Once your keyword strategy has been implemented, Google has had time to react and you’ve been able to collect enough data, rinse and repeat. They don’t call it search engine optimization for nothing.
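
To make steps 4 and 5 concrete, here is a minimal Python sketch of the expand-and-group idea. The seed terms, modifiers, and intent rules are invented for illustration; in practice, the candidate list and its volumes would come from your keyword research tools rather than be generated by hand.

    # Toy keyword expansion and intent grouping (illustrative only;
    # seeds, modifiers, and intent rules are made up for this sketch).
    from itertools import product

    seeds = ["shaving kit", "razor subscription"]
    modifiers = ["best", "cheap", "monthly", "free trial", "reviews"]

    # Step 4: expand seeds into long-tail permutations.
    candidates = [f"{m} {s}" for m, s in product(modifiers, seeds)] + seeds

    # Step 5: group terms by (very naive) search intent.
    def intent(keyword):
        if any(w in keyword for w in ("best", "reviews")):
            return "consideration"
        if any(w in keyword for w in ("cheap", "free trial", "monthly")):
            return "conversion"
        return "awareness"

    groups = {}
    for kw in candidates:
        groups.setdefault(intent(kw), []).append(kw)

    for bucket, terms in groups.items():
        print(bucket, "->", terms)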

What are the Most Common SEO Keyword Types?

Keywords can be categorized and tagged in multiple ways for a variety of reasons. Here are the most common types and examples of SEO keywords.

Branded vs. Unbranded

Branded search terms contain the brand in the query. This could include the official brand names, misspellings, branded acronyms, branded campaign names or taglines, parent companies, or anything else with obvious branded search intent.

Unbranded, or non-branded, terms are all other keywords you may consider. Unbranded terms often describe the customer problem or your business offering.

Some businesses have non-distinct names that can make this delineation more difficult. For instance, is a search for “Kansas City Zoo” branded or unbranded when the name of the zoo is… Kansas City Zoo?

Branded terms generally bring in the highest converting traffic because the searcher already has a certain level of brand familiarity and (often) affinity.

Examples:

  • Branded: Houston Rockets
  • Unbranded: the unequivocal greatest basketball organization of all time

Seed vs. Page-specific Keywords

Seed words are the obvious, initial list of words you start with in the keyword research process. They act as the seeds you “plant” to grow your list.

Seed words are often relevant to most of your website, if not all of it. Page-specific keywords are generally found later in the keyword research process and are applicable to only a single page or set of pages.

Examples for Home Depot:

  • Seed: home improvement store
  • Page-specific: deck building supplies

Long-tail vs. Head Terms

Keywords with the highest search demand are called head terms. Conversely, those with relatively low demand are considered long-tail.

Why? When you graph keywords by search volume, the few high-volume head terms drop off quickly, while the many lesser-searched terms stretch on seemingly forever, like a tail.

The middle of the graph is often aptly named the “chunky middle” (or torso). With 15 percent of searches being new to Google each day, it shouldn’t be surprising that most search queries are considered long-tail, even if each individual long-tail query gets searched very few times.

[Image: dinosaur with a long tail]

Head terms and long-tail terms tend to have the following contrasting characteristics. However, besides volume, none of these are absolute.

  • Search volume: high (head) vs. low (long-tail)
  • Ranking competition: high vs. low
  • Converting traffic: low vs. high
  • Length: few words vs. many words
  • Best for: top-level pages (head) vs. lower-level pages (long-tail)
  • Search intent: multiple intents vs. singular intent


Examples:

  • Head: Bob Dylan
  • Long-tail: Who is Jakob Dylan’s father?

Primary vs. Secondary Keywords

Also labeled “targeted” or “focus”, primary keywords are used to describe your most important keywords. These terms can be used in the context of your entire site or a single page.

Secondary (also called “tertiary” or “supporting”) keywords include all other keywords you are targeting and/or incorporating. In some contexts, secondary terms are those you are loosely optimizing for, but they’re just not considered a high priority. In other scenarios, secondary keywords act as the semantic or long-tail support to help you get the most out of your primary keyword targeting.

Examples for a subscription shaving kit product page:

  • Primary: shaving kit subscription
  • Secondary: monthly, razors, free trial, custom

Step, Stage, or Phase

SEOs often recommend categorizing your keywords according to a marketing funnel or customer journey. This can help ensure you are targeting customers at each critical point.

Some sets of categories have the brand in the center (e.g., awareness, consideration, conversion, retention) while others are more customer-centric (e.g., unaware, problem aware, solution aware, brand aware). Similarly, some simply determine the action-oriented mindset of the consumer (e.g., navigational, informational, transactional).

Examples:

  • Awareness: 30th birthday party ideas
  • Consideration: Las Vegas travel reviews
  • Conversion: flight and hotel packages to Las Vegas
  • Retention: Mandalay Bay loyalty program

Local vs. Global Keywords

Depending on its usage, a local keyword can mean one of two things:

  1. The searcher is looking for something geographically nearby: This can be very straightforward like “library near me” or “2-bedroom rentals in Phoenix”, or it could be more subtle like “restaurants” or “What time does Whataburger close?”.
  2. The searcher has a high probability of being in a certain area: For instance, “Why did Oklahoma Joe’s change their name?” could be considered a local term because there’s a good chance the searcher is from Kansas or Missouri. Why? Those are the only two states this exceptional barbecue establishment calls home. By the way, it is now called Joe’s Kansas City BBQ if you ever happen to be coming through town.

Examples:

  • Local: 2-bedroom rentals in Phoenix
  • Global: Is renters insurance worth it?

Audience Type

Rarely does someone self-identify in a search.

When’s the last time you started a search with “I’m an XX year-old, college-educated digital marketer looking for [rest of your search]”? I’m going to go out on a limb and guess this has never happened.

However, the ‘who’ behind the searcher can often be found in the implicit information of the query.

While almost no queries are exclusively searched by one group, many heavily skew towards a single audience.

One of the best ways to find out who is searching for a term is to Google it and look at the results. Then ask yourself who the top results seem to be talking to.

If Google’s job is to give a searcher what they want, then the target audience for the top results of a query should be the same audience who completed the query.

Examples:

  • Patient: Is diabetes hereditary?
  • Doctor: T2DM treatment algorithm

Evergreen vs. Topical

Evergreen keywords have steady search volume with little variance over time. On the other hand, topical keywords are either seasonal (e.g., valentine’s day gift ideas), flashes in the pan (e.g., covfefe), or consistently relevant (e.g., Taylor Swift).

Some evergreen keywords can switch to being topical when an event makes them culturally relevant, like searches for a celebrity immediately after their unexpected death or a city when it’s hosting the World Cup. Google often favors new content for topical keywords because the “query deserves freshness”.

People like to create evergreen content because it can be a low investment relative to the long-term value it produces. However, the competition and initial cost are often steep. Conversely, topical content is attractive because it has a lower cost of entry, weaker competition, and provides immediate value – but that value has a short shelf life.

Examples:

  • Evergreen: how to know if you’re pregnant
  • Topical: movie showtimes this weekend

[Image: Google Trends graph for a topical keyword]

Keywords vs. Carewords

For the first time since we moved in four years ago, we decided to pay to get our house cleaned. Our searches were very much based in logic:

  • What’s the cost for how much work?
  • Do they use natural products?
  • Did they get good reviews?
  • Are they flexible on timing?

However, how the companies made us feel certainly played a key role, even if it was mostly subconscious. In this instance, content that made me reflect on all the time I was going to save, how this would be one less thing I had to stress about, even the smell of a fresh house when I walked in the door – likely played a role in my final decision.

We search with our neocortex, but our reptilian and paleopallium brains often make the decisions.

Sara Howard describes carewords using an example of buying a car. Would you include “reliable warranty” in a search for a new vehicle? Probably not. Do you want to know the warranty is reliable once you’re on the page? Absolutely.

In short, carewords are low-to-no-traffic-generating terms that increase on-site engagement and conversions for existing traffic.

Examples:

  • Keywords: wet bar ideas for basement
  • Carewords: wine enthusiast, ample storage, simple, hosting, durable, man cave

How do You Optimize Your Website for Keywords?

Much like choosing keywords, effectively optimizing your website for keywords could live on its own blog post. However, here are a few tips to get started.

Where to Incorporate Keywords on a Webpage

  • URLs: URLs rarely change, are highly visible and describe the entire page. For those reasons, Google places some value in what they say.
  • Static content: Search engines are getting much better at crawling dynamic content. Static content is a near-guarantee for indexation.
  • Title tags: Title tags influence rankings and click-through rate (CTR), and if written effectively, keywords can help with both.
  • Meta description tags: Unlike title tags, meta descriptions do not influence rankings in Google. However, including them can increase CTR.
  • Most visible content: Google’s job is to understand content the way we do. An H1 tag at the top of the page gets far more eyeballs than boilerplate content at the bottom. Whether it’s a heading tag, early body copy or a bolded phrase, the most visible content is generally the most influential for SEO.
  • Internal links and surrounding content: Incorporating keywords into the anchor text of links pointing to your page from others on the site helps show Google what your page is about. Similarly, content nearby anchor text pointing to your page is also observed by Google and, to a lesser degree, is used to describe the destination page.
  • Image and video file names: Instead of letting your phone give your image or video a default name that usually contains something random and nonsensical, give it a descriptive name using a relevant keyword.
  • Image alt attributes: Alt tags not only make your site more inclusive for your visually impaired audience, they give Google a better idea of your picture. Incorporate keywords when appropriate.
  • Image title attributes: Image titles don’t work on all browsers the same way, which is why Google may not put much weight into this content. However, if there is an opportunity to gracefully include keywords, go for it.
  • ARIA tags: ARIA tags are similar to alt attributes in that they help make website content more accessible to those with disabilities. You can use ARIA tags on certain types of dynamic content, interactive content, background images, and more.
  • Video closed captioning and/or transcripts: Some videos contain extremely relevant keywords but Google has no clue. Make sure what is heard and seen gets included in your indexable closed captioning or transcript.
  • Schema markup: Schema helps add context to content. When applicable, mark your keywords up with the most appropriate schema properties to remove some of the guesswork for Google.
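
Put together on a single page, the placements above might look something like the following sketch for a page targeting the keyword “shaving kit subscription”. All names, URLs, and schema values here are invented for illustration:

    <!-- Illustrative only; URL, names, and values are made up. -->
    <!-- URL: https://example.com/shaving-kit-subscription -->
    <head>
      <title>Shaving Kit Subscription – Custom Monthly Razors</title>
      <meta name="description"
            content="Start a shaving kit subscription with custom razors delivered monthly. Free trial available.">
    </head>
    <body>
      <h1>Shaving Kit Subscription</h1>
      <p>Build a custom <a href="/razors">monthly razor plan</a> in minutes.</p>
      <img src="shaving-kit-subscription-box.jpg"
           alt="Monthly shaving kit subscription box with razors">
      <script type="application/ld+json">
      {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Shaving Kit Subscription",
        "description": "Monthly shaving kit subscription with custom razors."
      }
      </script>
    </body>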

Keyword integration tips

  • Don’t overdo it: Over-optimization is a real thing. It can turn away your customers and send you to Google’s dog (or Panda) house. Each one of the areas above has been automated, exploited, and tarnished. Ask yourself if it helps or hurts user experience. Make your decision based on that answer.
  • Ignore the meta keywords tag: The meta keywords tag gets little, if any, attention from the main search engines. Don’t waste your time here.
  • Don’t optimize each page in a vacuum: Unless you have a one-page site, you need to look at your keyword targeting by taking all pages into context. This will guard against any gaps or keyword cannibalization that can happen when you work on each page in a silo.
  • Test everything: If you have the opportunity to work on sites with a massive number of pages, you have a perfect opportunity to set up some worthwhile tests to polish your techniques.

When Won’t Keywords Matter?

How do we know keywords will always matter? In reality, there’s no way to know, but many of the root arguments shared in this guide have been the same for over 20 years, and they show no signs of pivoting.

With that said, I do think I can tell you the next time “keywords are dead” will ferociously bounce around the SEO echo chambers. Larry Page doesn’t just want Google to be at the level of a human, he wants it to be superhuman.

The introduction of Google Now has given us a glimpse of what is to come: Google searching for what we want without us having to ask.

If Google does our searching for us, would keywords still matter? Yes, but that’s for another time.


Image Credits

Featured Image: Paulo Bobita
Dinosaur Image: Clker-Free-Vector-Images/Pixabay
All other images, screenshots, and video taken by author, August 2017

By Tylor Hermanson

Sourced from Search Engine Journal

Before you dish out money to bid for a top-ranked ad position on a search engine, you may want to pause and make sure it’s actually going to pay off.

By MediaStreet Staff Writers

New research out of Binghamton University, State University of New York suggests that instead of just spending to get that top spot, advertisers should be considering other factors as well to ensure they are getting the best results from their sponsored search advertising campaigns.

Sponsored search advertising involves paying search engines, like Google and Bing, to bid for placements on the search results pages for specific keywords and terms. The ads appear in sponsored sections, separate from the organic search results, on those pages.

“The common belief in sponsored search advertising is that you should buy the top ad position to get more clicks, because that will lead to more sales,” said Binghamton University Assistant Professor of Marketing Chang Hee Park. “But the fee for the top position could be larger than the expected sales you’d get off that top position.”

Park, with the help of Binghamton University Professor of Marketing Manoj Agarwal, analysed data collected from a search engine and created a model that can forecast the number of clicks advertisers could expect in sponsored search markets based on four factors:

  • Rank in the sponsored listings
  • Website quality
  • Brand equity
  • Selling proposition

The model gives advertisers a way to quantify the expected clicks they’d get by adjusting these four factors, while also taking into consideration how their competitors are managing these four factors. This could enable advertisers to find a perfect blend of the four factors to ensure they are getting the most out of what they are paying for their ad positions.
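
The authors’ actual model is statistical and estimated from real search engine data. Purely to illustrate the idea of weighing rank against the other three factors across competitors, here is a toy Python sketch; the weights and functional form are invented and should not be read as the Park and Agarwal model:

    # Toy expected-click-share model (NOT the published model; weights
    # and functional form are invented purely to illustrate the idea).
    def attractiveness(rank, website_quality, brand_equity, selling_prop):
        position_effect = 1.0 / rank          # higher positions get seen more
        quality = (0.5 * website_quality      # each factor scored 0..1
                   + 0.3 * brand_equity
                   + 0.2 * selling_prop)
        return position_effect * quality

    def expected_click_shares(advertisers):
        scores = [attractiveness(**a) for a in advertisers]
        total = sum(scores)
        return [round(s / total, 2) for s in scores]

    market = [
        dict(rank=1, website_quality=0.3, brand_equity=0.8, selling_prop=0.6),
        dict(rank=2, website_quality=0.9, brand_equity=0.5, selling_prop=0.7),
    ]
    # A strong site in position 2 closes much of the gap with a
    # weaker site that paid for position 1.
    print(expected_click_shares(market))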

It may also indicate that they should be spending more money to bolster their brand or website rather than amplifying their offers in top ad positions.

“Using this model, you may find that paying less for a lower ad position while investing more in improving your website is more effective than spending all of that money strictly on securing top ad positions,” said Agarwal.

This applies especially if your competitor has a poorer-quality website, but is spending more than you on securing top ad positions.

Their model found that poor-quality advertisers that are ranked higher in ad positions drive consumers back to the search results page, leading consumers to then click on advertisers in lower ad positions to find what they are looking for.

In contrast, they also found that a highly ranked, good-quality advertiser results in significantly fewer clicks for all the advertisers ranked below it.

“It’s more likely that in the top position, all advertisers being equal, you’ll get more clicks. But depending on these four factors, as well as the quality of your competitors, you may find that you’ll get more clicks in the second or the third position,” said Park.

“Conceptually, this is not a new idea, but now the model can help determine this by accounting for multiple factors at play at the same time.”

Advertisers aren’t the only ones who can benefit from this research.

Park and Agarwal’s model found that simply reordering the listed advertisers could result in significant changes in overall click volume (the total number of clicks across all advertisers) for search engines.

“Because they often charge on a pay-per-click model, search engines can now simulate which ordering of advertisers in a sponsored search market results in the most overall clicks and, therefore, the most revenue,” said Park. “Search engines may want to consider charging advertisers in a way that gives the search engine more flexibility in determining the order in which the ads in sponsored sections are displayed.”


Not all “likes” are equal.

By MediaStreet Staff Writers

While the trusty “like” button is still the most popular way to signal approval for Facebook posts, a computer model may help users and businesses navigate the increasingly complicated way people are expressing how they feel on social media.

In a study, researchers developed a social emotion mining computer model that one day could be used to better predict people’s emotional reactions to Facebook posts, said Jason Zhang, a research assistant in Penn State’s College of Information Sciences and Technology. While Facebook once featured only one official emoticon reaction – the like button – the social media site added five more buttons – love, haha, wow, sad and angry – in early 2016.

“We want to understand the user’s reactions behind these clicks on the emoticons by modelling the problem as a ranking problem – given a Facebook post, can an algorithm predict the right ordering among six emoticons in terms of votes?” said Zhang. “But what we found was that existing solutions sometimes predict the user’s emotions and their rankings poorly.”

Zhang added that merely counting clicks fails to acknowledge that some emoticons are less likely to be clicked than others, which is called the imbalance issue. For example, users tend to click the like button the most because it signals a positive interaction and it is also the default emoticon on Facebook.

“When we post something on Facebook, our friends tend to click the positive reactions, usually love, haha, or, simply, like, but they’ll seldom click angry,” said Zhang. “And this causes the severe imbalance issue.”

For social media managers and advertisers, who spend billions buying Facebook advertisements each year, this imbalance may skew their analysis on how their content is actually performing on Facebook, said Dongwon Lee, associate professor of information sciences and technology. The new model – which they call robust label ranking, or ROAR – could lead to better analytic packages for social media analysts and researchers.

“A lot of the commercial advertisements on Facebook are driven by likes,” said Lee. “Eventually, if we can predict these emoticons more accurately using six emoticons, we can build a better model that can discern a more precise distribution of emotions on social platforms with only one emoticon – like – such as Facebook before 2016. This is a step in the direction of creating a model that could tell, for instance, that a Facebook posting made in 2015 with a million likes in fact consists of only 80 percent likes and 20 percent angry. If such a precise understanding of social emotions is possible, that may impact how you advertise.”

The researchers used an AI technique called “supervised machine learning” to evaluate their newly-developed solution. In this study, the researchers trained the model using four Facebook post data sets including public posts from ordinary users, the New York Times, the Wall Street Journal and the Washington Post, and showed that their solution significantly outperformed existing solutions. All four sets of data were analysed after Facebook introduced the six emoticons in 2016.
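
The details of ROAR live in the researchers’ paper. Purely as an illustration of the ranking task itself – and of why the imbalance issue bites – here is a toy Python baseline with invented data that ranks the six emoticons by their average share and scores the prediction by pairwise ordering:

    # Toy emoticon-ranking baseline (NOT the ROAR model; the data and
    # method are invented to illustrate the task and the imbalance).
    EMOTICONS = ["like", "love", "haha", "wow", "sad", "angry"]

    # Training posts: vote counts per emoticon. Note the imbalance:
    # "like" dominates because it is the default reaction.
    train = [
        {"like": 900, "love": 60, "haha": 25, "wow": 10, "sad": 3, "angry": 2},
        {"like": 500, "love": 20, "haha": 70, "wow": 5, "sad": 1, "angry": 4},
    ]

    def avg_shares(posts):
        shares = {e: 0.0 for e in EMOTICONS}
        for p in posts:
            total = sum(p.values())
            for e in EMOTICONS:
                shares[e] += p[e] / total / len(posts)
        return shares

    def rank(scores):
        return sorted(EMOTICONS, key=lambda e: -scores[e])

    predicted = rank(avg_shares(train))

    # Score against one held-out post using pairwise ordering agreement
    # (a crude stand-in for proper rank-correlation metrics).
    truth = rank({"like": 300, "love": 15, "haha": 40,
                  "wow": 6, "sad": 2, "angry": 1})
    pairs = [(a, b) for i, a in enumerate(EMOTICONS) for b in EMOTICONS[i + 1:]]
    agree = sum((predicted.index(a) < predicted.index(b)) ==
                (truth.index(a) < truth.index(b)) for a, b in pairs)
    print(predicted, f"{agree}/{len(pairs)} pairs ordered correctly")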

The researchers suggest future research may explore the multiple meanings for liking a post.

“Coming up with right taxonomy for the meanings of like is another step in the research,” said Lee. “When you click on the like button, you could really be signalling several emotions – maybe you agree with it, or you’re adding your support, or you just like it.”

And, as we marketers know, the more you understand how your market feels, the better you can tailor your advertising to them.


Adapting SEO priorities to the customer journey and balancing organic versus paid search shape SEO strategy in 2018, according to a new survey.

By MediaStreet Staff Writers

Social media marketing is the leading SEO service priority among businesses in 2018, according to new research from Clutch and Ignite Visibility. Over 90% of businesses that invest in SEO also invest in social media.

The survey of 303 marketing decision-makers reveals that most tend to shape their SEO strategy based on the SEO services they prioritise and the challenges they face.

Top SEO Priorities 2018

Two factors impact the direction of a business’ SEO strategy: The shifting customer journey and whether the business focuses on paid search or organic SEO services.

Organic SEO services include:

  • On-site optimisation – web design, site infrastructure, blogging
  • Off-site optimisation – content marketing, social media marketing

Over 40% of businesses that invest in SEO focus on organic services, compared to 19% that focus on paid search.

Businesses that focus on organic SEO are more likely to use in-house staff for general marketing, such as content marketing and social media. Over three-fourths (76%) of businesses that focus on organic services use in-house staff.

On the other hand, businesses that focus on paid search are more likely to hire an SEO company. More than two-thirds (68%) of businesses that focus on paid search hire an SEO company, compared to just 37% that rely on in-house staff.

[Image: Top SEO Priorities 2018 (PRNewsfoto/Clutch)]

“Paid search complements organic SEO by providing feedback on keyword research, audience targeting, and effective ad copy,” said Eythor Westman, head of paid media at Ignite Visibility.

How businesses adapt to shifts in the customer buying journey is another factor that shapes SEO strategy. The rise of mobile search drives changes to the customer buying journey.

SEO experts agree that customers use their mobile devices to learn about a company through social media and site content before converting to make a purchase.

“Now, somebody Googles a keyword. Then they click on a top ranking term like, ‘SEO company.’ They read our blog and click around social media,” said John Lincoln, CEO of Ignite Visibility. “Then, they convert three weeks later after they feel comfortable with you.”

In response, businesses prioritise SEO services that facilitate the customer journey. Along with social media (20%), businesses rate creating content to earn links (15%) and mobile search optimisation (14%) as their top SEO priorities.

Read the full report here.


A new survey indicates that 1 in 5 small businesses use social media in place of a website. Many assume a website is cost-prohibitive and may not consider the risks of not having one.

By MediaStreet Staff Writers

More than one-third (36%) of small businesses do not have a website, according to the websites section of the fourth annual Small Business Survey conducted by Clutch, a B2B research firm. One in five small businesses (21%) selectively use social media instead of a website in an effort to engage customers.

The survey indicates that small businesses consider cost a bigger concern than the potential repercussions of not having a website.

Social media platforms such as Facebook and Instagram attract small businesses by cultivating a highly engaged user base. However, relying solely on social media may be a risky strategy for businesses.

“Whenever you put all of your eggs into someone else’s basket, it’s risky,” said Judd Mercer, Creative Director of Elevated Third, a web development firm. “If Facebook changes their algorithm, there’s nothing you can do.”

Facebook recently announced changes that potentially increase the risk of using social media in place of a website. The social media platform plans to prioritise posts from family and friends over posts from brands.

This new policy may make it more difficult for small businesses to reach their audiences through social media. As a result, websites are expected to regain importance among businesses – as long as cost is not considered an obstacle.

Among small businesses that do not currently have a website, more than half (58%) plan to build one in 2018.

Some Small Businesses Say Website Cost is Prohibitive, But Others Cite Costs of $500 or Less

More than a quarter (26%) of small businesses surveyed say cost is a key factor that prevents them from having a website. However, nearly one-third of small businesses with websites (28%) report spending $500 or less.

Small businesses may not be aware that some web development agencies offer packages that defray costs by dividing website construction into multiple phases or sliding rates for small businesses. “You don’t necessarily need to launch with your first-generation website,” said Vanessa Petersen, Executive Director of Strategy at ArtVersion Interactive Agency, a web design and branding agency based in Chicago. “Maybe just start small.”

Mobile-Friendly Websites Becoming Standard

Businesses that do have websites are moving en masse to mobile-friendly ones, the survey found. Over 90% of respondents said their company websites will be optimised for viewing on mobile devices by the end of this year.

In addition to the 81% of company websites that are already optimised for mobile, an additional 13% say they plan to optimise for mobile in 2018.

Clutch’s 2018 Small Business Survey included 351 small business owners. The small businesses surveyed have between 1 and 500 employees, with 55% indicating that they have 10 or fewer employees.

To read the full report and source the survey data, click here.

Search engine optimization (SEO) very much revolves around Google today. However, the practice we now know as SEO actually pre-dates the world’s most popular search engine co-founded by Larry Page and Sergey Brin.

Although it could be argued that SEO and all things search engine marketing began with the launch of the first website published in 1991, or perhaps when the first web search engine launched, the story of SEO “officially” begins a bit later, around 1997.

According to Bob Heyman, author of “Digital Engagement,” we can thank none other than the manager of rock band Jefferson Starship for helping give birth to a new field that we would grow to know as “search engine optimization.”

You see, he was quite upset that the official Jefferson Starship website was ranking on Page 4 of some search engine at the time, rather than in Position 1 on Page 1.

Granted, while we may never know whether this tale is revisionist history or 100 percent fact, all signs point to the term SEO originating around 1997.

Do a little more hunting around and you’ll see John Audette of Multimedia Marketing Group was using the term as early as February 15, 1997.

Ranking high on search engines in 1997 was still a pretty new concept. It was also very directory driven. Before DMOZ fueled the original Google classification, LookSmart was powered by Zeal, Go.com was its own directory, and the Yahoo Directory was a major player in Yahoo Search.

If you’re unfamiliar with DMOZ, the Mozilla Open Directory Project (remember, Mozilla was a company and Moz was a brand well before SEOMoz), it was basically a Yellow Pages for websites. That’s what Yahoo was originally founded upon: the ability to find the best websites out there, as approved by editors.

I started doing SEO in 1998, out of necessity for clients who had built cool sites but were getting little traffic. Little did I know it would become a lifestyle.

Then again, the World Wide Web was still a pretty new concept at the time to most people.

Today? Everybody wants to rule the search engine results pages (SERPs).

Search Engine Optimization vs. Search Engine Marketing

Before Search Engine Optimization became the official name, other terms were used as well. For example:

  • Search engine placement
  • Search engine positioning
  • Search engine ranking
  • Search engine registration
  • Search engine submission
  • Website promotion

But no discussion would be complete without mentioning another term: Search Engine Marketing.

At one point in 2001, one prominent industry writer suggested search engine marketing as a successor to search engine optimization.

Obviously, it didn’t happen.

Prepare yourself now: you’re going to see many false claims (e.g., “SEO is dead,” “the new SEO”) and attempts at rebranding SEO (“Search Experience Optimization”).

While SEO as a term isn’t perfect – after all, we aren’t optimizing search engines, we’re optimizing our web presence – it has remained the preferred term of our industry for 20 years now and likely will be for the foreseeable future.

As for Search Engine Marketing – it is still used but is now more associated with paid search. The two terms co-exist peacefully today.

A Timeline of Search Engine History

Search engines have changed the way we find information, conduct research, shop for products and services, entertain ourselves, and connect with others.

Behind almost every online destination – whether it’s a website, blog, social network, or app – is a search engine. Search engines have become the connecting force and directional guide to everyday life.

But how did this all start?

We’ve put together a timeline of notable milestones from the history of search engines and search engine optimization to understand the roots of this technology, which has become such an important part of our world.

Dawn of SEO: “The Wild West” Era

In the last decade of the 1900s, the search engine landscape was highly competitive. You had your choice of search engines – both human-powered directories and crawler-based listings – including the likes of AltaVista, Ask Jeeves, Excite, Infoseek, Lycos, and Yahoo.

In the beginning, the only way to perform any kind of SEO was through on-page activities. This included making sure the content was good and relevant, there was enough text, your HTML tags were accurate, and you had internal and external links, among other factors.

If you wanted to rank well in this era, the trick was pretty much just repeating your keywords enough times throughout your webpages and meta tags. Want to outrank a page that uses a keyword 100 times? Then you’d use the keyword 200 times! Today, we call this practice spamming.

Here are some highlights:

  • 1994: Yahoo was created by Stanford University students Jerry Wang and David Filo in a campus trailer. Yahoo was originally an Internet bookmark list and directory of interesting sites. Webmasters had to manually submit their page to the Yahoo directory for indexing so that it would be there for Yahoo to find when someone performed a search. AltaVista, Excite, and Lycos also launched.
  • 1996: Page and Brin, two Stanford University students, built and tested Backrub, a new search engine that ranked sites based on inbound link relevancy and popularity. Backrub would ultimately become Google. HotBot, powered by Inktomi, also launched.
  • 1997: Following on the success of A Webmaster’s Guide to Search Engines, Danny Sullivan launched Search Engine Watch, a website dedicated to providing news about the search industry, tips on searching the web, and information about how to rank websites better. (Ten years later, after leaving SEW, Sullivan founded another popular search publication, Search Engine Land.) Ask Jeeves also debuted and Google.com was registered.
  • 1998: Goto.com launched with sponsored links and paid search. Advertisers bid on Goto.com to rank above organic search results, which were powered by Inktomi. Goto.com was ultimately acquired by Yahoo. DMOZ (the Open Directory Project) became the most sought-after place for SEO practitioners to get their pages listed. MSN entered into search with MSN Search, initially powered by Inktomi.
  • 1999: The first-ever all search marketing conference, Search Engine Strategies (SES), took place. You can read a retrospective on that event by Sullivan here. (The SES conference series continued running under various monikers and parent companies until shutting down in 2016.)

The Google Revolution

In 2000, Yahoo pulled off the worst strategic move in the history of search: it partnered with Google and let Google power its organic results instead of Inktomi. Beforehand, Google was a little-known search engine. The end result: every Yahoo search result said “Powered by Google,” and Yahoo ended up introducing its largest competitor to the world. Google became a household name.

Until this point, search engines mainly ranked sites based on the on-page content, domain names, ability to get listed in aforementioned directories, and basic site structure (breadcrumbing). But Google’s web crawler and PageRank algorithm were revolutionary for information retrieval. Google looked at both on-page and off-page factors – the quantity and quality of external links pointing to a website (as well as the anchor text used).

If you think about it, Google’s algorithm was essentially about “if people are talking about you, you must be important.”
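
The mechanics behind that intuition are public: Page and Brin’s PageRank paper defines a page’s score recursively from the scores of the pages linking to it. As a toy illustration – a made-up four-page link graph, simplified to ignore dangling pages – the core power iteration fits in a few lines of Python:

    # Toy PageRank power iteration on a made-up four-page link graph.
    links = {                    # page -> pages it links out to
        "a": ["b", "c"],
        "b": ["c"],
        "c": ["a"],
        "d": ["c"],
    }
    d = 0.85                     # damping factor from the original paper
    pages = list(links)
    pr = {p: 1 / len(pages) for p in pages}

    for _ in range(50):          # iterate until the scores settle
        pr = {p: (1 - d) / len(pages)
                 + d * sum(pr[q] / len(links[q])
                           for q in pages if p in links[q])
              for p in pages}

    # "c" wins: it has the most (and best-endorsed) inbound links.
    print(sorted(pr.items(), key=lambda kv: -kv[1]))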

Although links were only one component of Google’s overall ranking algorithm, SEO practitioners latched onto links as being the most important factor – and an entire sub-industry of link building was created. Over the next decade, it became a race to acquire as many links as possible in the hopes of ranking higher and links became a heavily abused tactic that Google would have to address in coming years.

It was also in 2000 that the Google Toolbar became available on Internet Explorer, allowing SEO practitioners to see their PageRank score (a number between 0-10). This ushered in an era of unsolicited link exchange request emails.

So with PageRank, Google essentially introduced a measure of currency to linking – one that would come to be misused much as domain authority is today.

Google’s organic results also got some company in the form of AdWords ads starting in 2000. These paid search ads began appearing above, below, and to the right of Google’s unpaid results.

Meanwhile, a group of webmasters informally got together at a pub in London to start sharing information about all things SEO in 2000. This informal gathering eventually turned into Pubcon, a large search conference series that still runs today.

Over the coming months and years, the SEO world got used to a monthly Google Dance, or a period of time during which Google updated its index, sometimes resulting in major ranking fluctuations.

Although Google’s Brin once famously said Google didn’t believe in web spam, his opinion had probably changed by the time 2003 rolled around. SEO got a lot harder following updates like Florida, because ranking took much more than just repeating keywords x number of times.

Google AdSense: Monetizing Terrible SEO Content

In 2003, after acquiring Blogger.com, Google launched AdSense, which serves contextually targeted Google AdWords ads on publisher sites. The mix of AdSense and Blogger.com led to a surge in monetized, simple Internet publishing – and a blogging revolution.

While Google probably didn’t realize it at the time, they were creating problems they would have to fix down the road. AdSense gave rise to spammy tactics and Made for AdSense sites filled with thin/poor/stolen content that existed solely to rank well, get clicks, and make money.

Oh and something else important happened in 2003. I founded the site you’re on, Search Engine Journal! And I’m incredibly happy to say we’re still here, going stronger than ever!

Local SEO & Personalization

Around 2004, Google and other top search engines started improving results for queries that had a geographic intent (e.g., a restaurant, plumber, or some other type of business or service provider in your city or town). By 2006, Google rolled out a Maps Plus Box, which I was quite impressed by at the time.

It was also around 2004 that Google and other search engines began making greater use of end-user data, such as search history and interests, to personalize search results. This meant the results you saw could be different from what a person sitting next to you in a coffee shop saw when searching for the same query.

Also in 2005, the nofollow attribute was created as a means to combat spam. SEO pros soon began using it for PageRank sculpting.
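
In markup terms, a nofollowed link simply carries an extra rel attribute (example.com used here as a placeholder):

    <!-- A link whose endorsement is withheld from search engines -->
    <a href="https://example.com/" rel="nofollow">paid or untrusted link</a>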

Google also unleashed a couple of noteworthy updates:

  • Jagger, which helped to diminish the level of unsolicited link exchanges that were flying around, as well as heralding the decline in the importance of anchor text as a factor due to its corruptibility.
  • Big Daddy (coined by Jeff Manson of RealGeeks), which improved the architecture of Google to allow for improved understanding of the worth and relationship of links between sites.

YouTube, Google Analytics & Webmaster Tools

In October 2006, Google acquired user-generated video sharing network YouTube for $1.65 billion, which ultimately became the second most used search property in the world.

Today, YouTube has more than a billion users. Due to its soaring popularity, video SEO became crucial for brands, businesses, and individuals that wanted to be found.

Google also launched two incredibly important products in 2006:

  • Google Analytics. This free, web-based tool was so popular at launch that webmasters experienced downtime and maintenance warnings.
  • Google Webmaster Tools. Now known as Search Console, Google Webmaster Tools let webmasters view crawling errors, see which searches their site showed up for, and request reinclusion.

Also in 2006, XML sitemaps gained universal support from the search engines. XML sitemaps let webmasters tell search engines about every URL on their website that is available for crawling. An XML sitemap contains not only a list of URLs but also a range of further information, which helps search engines crawl more intelligently.
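
A minimal sitemap file (with a placeholder URL) looks like this; the optional child elements carry that further information:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2006-11-18</lastmod>
        <changefreq>weekly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>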

Universal Search

We really began to see search starting to evolve in new and exciting ways starting in 2007. All of these updates were aimed at improving the user experience.

Let’s start with Google’s Universal Search. Until this point, the search results had consisted of 10 blue links.

Then Google began blending traditional organic search results with other types of vertical results like news, video, and images. This was easily the biggest change to Google search – and SEO – since the Florida update.

Cleaning up the Cesspool

In 2008, then-Google CEO Eric Schmidt said the Internet was becoming a cesspool and that brands were the solution. “Brands are how you sort out the cesspool,” he said.

Less than six months after his comment, along came a Google update called Vince. Big brands suddenly seemed to be ranking a whole lot better in the SERPs.

But it wasn’t really intended to reward brands, according to Google. Google wanted to put a greater weight on trust in the algorithm (and big brands tend to have more trust than smaller and less-established brands).

Shortly after this update, Google released another, called Caffeine, to improve the speed of its indexing. As SEJ reported at the time, Caffeine was “a next-generation search architecture for Google that’s supposed to be faster and more accurate, providing better, more relevant results and crawling larger parts of the web.”

Speaking of speed, in 2010 Google announced that site speed was a ranking factor.

Bing & The Search Alliance

In 2009, Microsoft Live Search became Bing. Then, in an attempt to challenge Google’s nearly 70 percent grip on the U.S. search market, Yahoo and Microsoft joined forces on a 10-year search deal (though it ended up being reworked five years later).

The Search Alliance saw Microsoft’s Bing power Yahoo’s organic and paid search results. While it made Bing the clear number two search engine, the pairing ultimately failed to break Google’s massive grip on search in the U.S. and globally.

The Rise of Social Media

Another phenomenon was emerging late in the 2000s – social networks.

Google made its big bet on YouTube (although it would try again with Google+). But other networks like Facebook, Twitter, and LinkedIn all emerged as major players (with many more to come and go in the following years).

Along with the rise of social media came speculation that social signals could impact search rankings. Yes, social media can help SEO, but indirectly – just as other forms of marketing can help drive more traffic to your website and increase brand awareness and affinity (which generates search demand).

While Google has denied time and again through the years that social shares (likes, tweets, +1’s, etc.) are a ranking factor, they have continued to show a strong correlation with rankings in various ranking factor studies. If you want to read more about this topic, I highly suggest reading How Social Media Helps SEO [Final Answer].

The Google Zoo: Panda & Penguin

Two major algorithmic updates, in 2011 and 2012, had a big impact on SEO that is still being felt to this day as Google once again attempted to clean up its search results and reward high-quality sites.

In 2011, Google found its search results facing severe scrutiny because so-called “content farms” (websites that produced high volumes of low-quality content) were dominating the search results. Google’s SERPs were also cluttered with websites featuring unoriginal and auto-generated content – and even, in some instances, scraper sites were outranking content originators.

As a result, these sites were making tons of advertising revenue (remember when I mentioned Google’s self-made AdSense problem?). These sites were also living and dying by organic traffic from Google.

But once Google’s Panda update rolled out in 2011, many websites saw much, if not all, of that traffic vanish overnight. Google provided some insight on what counts as a high-quality site.

Aimed at eliminating low-quality (or thin) content, Panda was updated periodically over the coming years, eventually becoming integrated into Google’s core algorithm in 2016.

With websites still recovering from the effects of Panda, Google unleashed a hotly anticipated over-optimization algorithm, intended to eliminate “aggressive spam tactics” from its results. Eventually dubbed Penguin, this algorithm targeted link schemes (websites with unusual linking patterns, including a high amount of exact-match anchor text mirroring the keywords they wanted to rank for) and keyword stuffing.

Penguin wasn’t updated nearly as frequently as Panda, with more than a year passing between some updates. And, like Panda, Penguin became part of Google’s real-time algorithm in 2016.

Things, Not Strings

In May 2012, Google unveiled the Knowledge Graph. This was a major shift away from interpreting keyword strings toward understanding semantics and intent.

Here’s how Google’s Amit Singhal, SVP, engineering, described it at launch:

“The Knowledge Graph enables you to search for things, people or places that Google knows about – landmarks, celebrities, cities, sports teams, buildings, geographical features, movies, celestial objects, works of art and more – and instantly get information that’s relevant to your query. This is a critical first step towards building the next generation of search, which taps into the collective intelligence of the web and understands the world a bit more like people do.”

Google enhanced its search results with this information. Knowledge panels, boxes, and carousels can appear whenever people do a search for one of the billions of entities and facts in the Knowledge Graph.

The next step in Google’s next generation of search came in September 2013 in the form of Hummingbird, a new algorithm designed to better address natural language queries and conversational search. With the rise of mobile (and voice search), Google needed to completely rebuild how its algorithm worked to meet the needs of modern searchers.

Hummingbird was considered to be the biggest change to Google’s core algorithm since 2001. Clearly, Google wanted to deliver faster and more relevant results, especially to mobile users.

Mobile-First

Starting around 2005, one question kept being asked in our industry: is this the “Year of Mobile”?

Well, it turns out that it wasn’t in 2005. Or 2006. Neither was 2007. Or 2008. Or 2009. Not even 2010 – when Google transformed itself into a mobile-first company.

Then 2011, 2012, 2013, and 2014 came and went. Mobile was talked about and much hyped because it was growing like crazy all this time. As more users adopted smartphones, they were increasingly searching for businesses and things while on the move.

Finally, in 2015, we had the Year of Mobile – the point at which mobile searches overtook desktop search for the first time on Google. And while this is true in terms of raw search numbers, it’s also true that search intent is quite different and conversion rates remain much lower on mobile devices.

This was also the year that comScore reported mobile-only internet users surpassed desktop-only users.

It was also in 2015 that Google launched a much-anticipated mobile-friendly algorithm update, designed to give users “the most relevant and timely results, whether the information is on mobile-friendly web pages or in a mobile app.”

In an attempt to speed up pages, Google also introduced Accelerated Mobile Pages (AMP) in 2016. AMP is designed to load content instantly and has mostly been adopted by news media and publishers.

And there’s much more mobile to come. Next up: a mobile-first index is on the way sometime in 2018.

Machine Learning, AI & Intelligent Search

Earlier, I mentioned that Google, originally built around information retrieval, became a mobile-first company. Well, that changed in 2017 because Google CEO Sundar Pichai declared Google an AI-first company.

Today, Google search is designed to inform and assist, rather than giving users a list of links. That’s why Google has built AI into all of its products – including search, Gmail, AdWords, Google Assistant, and more.

In terms of search, we’ve already started to see the impact of AI with Google RankBrain. Announced in October 2015, RankBrain was initially used to try to interpret the 15 percent of searches that Google had never seen before, based on the words or phrases the user entered.

Since that time, Google has expanded RankBrain to run on every search. While RankBrain impacts ranking, it isn’t a ranking factor in the traditional sense, where you get rewarded with better rankings for doing x, y, and z.

And there’s much more coming soon in the world of intelligent search.

Voice searches are increasing. Visual search has gotten insanely good. And users (and brands) are increasingly adopting chatbots and using personal assistants (e.g., Apple’s Siri, Amazon’s Alexa, and Microsoft’s Cortana).

Exciting times are ahead for SEO.

Conclusion

Search engines and SEO have come a long way since the 1990s, and we’ve only touched on a few of the highlights in this post.

The history of SEO has been filled with exciting turns – the birth of new search engines, the death of old search engines, new SERP features, new algorithms, and constant updates, plus the emergence of great SEO publications, conferences, tools, and experts.

While search engines and SEO have evolved greatly over the years, one thing remains true: as long as there are search engines, SEO will remain vital. And we’ve only gotten started!

 

Featured Image Credit: Paulo Bobita

By Loren Baker

Sourced from Search Engine Journal

By Matt Cayless

Search engine optimization can seem like an alien concept to those unfamiliar with it, and one of the biggest struggles for marketers tasked with adopting search engine marketing is explaining the concept to a boss or client who is brand new to SEO.

There’s no doubt that SEO is a complex and ever-evolving marketing process – so how do you go about communicating how it works and why it’s important?

Here are three actionable tips professionals can use to explain SEO to their boss or client.

Tell them it’s about humans, not bots

When someone dives into SEO for the first time, they might feel intimidated by the mention of search bots and algorithms, and wonder why these things are being favored over marketing to real humans.

But the thing is, SEO is for humans.

Google uses bots and algorithms to help humans discover the content most valuable to them, and we, in turn, need to create human-friendly experiences and content to succeed.

Tell your boss or client that by adopting SEO strategies, they’ll also be working on giving their customers a better experience with their brand or business, which is always a positive step.

Provide simple examples of SEO techniques

Link profiles, algorithm updates, on-page and off-page ranking factors, metadata, keywords, semantic search. These things might all make sense to you, but they’re nonsense to the uninitiated. Don’t hit your boss or client with a bunch of technical terms. Instead, go back to basics, such as:

“When a website links to your own, it helps Google to know what your site is about and that it has useful information.”

“Google looks at the actual words on a web page, as well as information built into your site that can’t be seen by human users, to understand what the page is about.”

“Keywords are words or phrases humans type into Google to find what they’re looking for on the web.”

Break everything down into simple, accessible concepts that explain not only what each element is, but why it works.

As their knowledge grows, you’ll find it easier to describe the more complex aspects of SEO to them.

Give them a brief history

For those of us who’ve worked in the industry for some time, it’s easy to understand the value of SEO because we’ve been a part of it while it has evolved. We understand the purpose behind new algorithm changes and can appreciate why some SEO tactics are more valuable than others. For someone without this knowledge, it can be more difficult to wrap their head around why and how SEO works.

Bubblegum Search have created this short, sharp interactive timeline on the history of SEO, and it’s the perfect place for beginners to get up to speed on why SEO exists and how it has grown into what it is today.


 

By Matt Cayless

Follow Matt Cayless on Twitter

Sourced from Social Media Today

By Carolyn Lyden

Keyword Research Is the Biggest Predictor of SEO Success

If you work in search marketing today, you know how difficult it is to stay on top of changes, trends, and tactics. From algorithm changes to SERP format updates, the tactical toolbelt of SEOs is always changing.

As a result of the pace of those changes, we often lose sight of the underlying semantic relationship between queries and the content we work so hard to create, optimize, and get in front of searchers. The most successful SEOs, however, never forget that the only way to optimize their content for user intent is through ongoing, in-depth keyword research.

Yet survey data shows search marketers still struggle with keyword research. A study from AWR found that marketers ranked keyword research the third most difficult SEO task to perform (behind link-building and content creation). The same survey found that most SEOs perform their keyword research in-house instead of outsourcing it, which may be why it’s such a difficult task and why around 42% perform it only when necessary.

Keyword research is tedious and time-consuming—even when performed on a quarterly basis. But it’s vital for success, because keyword research is all about determining user intent. And without user intent, there is nothing guiding the content we create, the backlinks we aim to earn, or the on-page content we optimize.

In my post “Search Engine Optimization Is Now User Experience Optimization,” I go over specific ways to find user intent in Google Analytics: “By looking at your current bounce rate, time on site, and pages per session, you can see where your user journey drops off, what keywords you might be going after that you don’t necessarily need to, what mediums bring in the most qualified leads, and more.”

Note: If you work in SEO, I highly recommend tuning into our Pro Webinar series, where we often go over tips, tools, and examples of how to best align your search marketing with user intent.

The Problem With the Way Most SEOs Do Keyword Research

Most of us do keyword research at the beginning of a project or, at best, twice per year on a current project. Performed en masse at regular intervals, keyword research is clumsy, and the findings usually lag real-time user intent. If something with your product, service, or industry changes in a single week, the keyword research you did three months ago won’t account for that.

That doesn’t mean that you should eschew the practice. Instead, it means your keyword and content research strategy just needs to be more limber and data-based. We can learn a lot from how highly successful SEOs approach the research and what tools they use.

Four Keyword Research Tools to Win Traffic and Influence Searchers

You might be familiar with a few of these already; however, the key is how you use them and incorporate what you learn into your SEO strategy. The second is a less common tool that we’ve used in-house at CallRail to dramatically increase our content’s alignment with the exact words a user types when researching our product.

I guarantee that if you begin using these research tools for your 2018 SEO strategy, and use what you learn to create content that better serves your users’ needs, next year you’ll see organic traffic gains you didn’t know were possible.

1. Google Search Console for Detailed User Info

SEOs might overlook Search Console because it’s free and has a limited date range, but the keyword information is good if you do a little digging. In the Search Analytics section, you can look at the clicks, impressions, CTR, and position of all the queries that your site showed up for in the SERPs. You can dig through to find keywords with high impressions but low CTR and try to see why there’s a mismatch: is the page that shows in the SERPs not serving the user’s needs?

You can also look at the query information for individual pages on your site to determine what they actually rank for (as opposed to what you want them to rank for). With this data, you can hypothesize why and formulate tests or improvements.

Search Console is a great place to get long-tail keyword data, as well. You can use that long-tail data from Search Console to see related questions in the other tools mentioned here.
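If you outgrow the web interface, the same report is available programmatically. The following is a minimal Python sketch against the Search Console (Webmaster Tools) API that flags high-impression, low-CTR queries; the property URL, date range, thresholds, and token file are placeholder assumptions, and you would first need to enable the API and complete Google’s OAuth flow:

    from google.oauth2.credentials import Credentials
    from googleapiclient.discovery import build

    # Hypothetical token file produced by a prior OAuth authorisation.
    creds = Credentials.from_authorized_user_file("token.json")
    service = build("webmasters", "v3", credentials=creds)

    response = service.searchanalytics().query(
        siteUrl="https://www.example.com/",  # placeholder verified property
        body={
            "startDate": "2018-01-01",
            "endDate": "2018-03-31",
            "dimensions": ["query"],
            "rowLimit": 5000,
        },
    ).execute()

    # Flag queries people see often but rarely click – candidates for
    # rewritten titles and descriptions, or for better-matched pages.
    for row in response.get("rows", []):
        if row["impressions"] > 1000 and row["ctr"] < 0.01:
            print(row["keys"][0], row["impressions"], row["ctr"], row["position"])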

2. CallRail for Call Recording Software

Call recordings help you figure out how leads actually talk about your products and services. People who pick up the phone to call about a product or service are generally pretty warm leads or even customers already. So, listening to these conversations should be considered keyword research—because it’s probably the way they searched for you to begin with.

In a detailed blog post on keyword research and call tracking, I talk about how the way you talk about your product or service isn’t always the way your potential clients do. And you can find out how they talk by listening to call recordings.

Sometimes the industry-specific jargon we use on our sites isn’t the same kind of language that users type when they search for solutions to their problems. You can hear how clients refer to your products and services in call recordings, and use the clients’ own language in your marketing materials.

Similarly, call recordings can show you whether you’re ranking in SERPs for something that you didn’t intend. An environmentally friendly cleaning company, for example, may want to promote its use of safe cleaning products that are free from harmful chemicals and bleach. Its marketing department creates campaigns and content around “ammonia-free cleaning services” and similar terms. However, upon receiving calls to the sales department, the company realizes that it’s actually showing up for terms like “free cleaning services.”

In this case, the cleaning company may want to optimize for a different phrase, such as “green cleaning services.” You can get that kind of one-on-one, detailed keyword information only from something like a call recording.
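If your call tracking platform lets you export transcripts as plain-text files (a hypothetical setup for illustration – export options vary by platform and plan), even a simple phrase counter can surface the words callers actually use. A minimal Python sketch:

    import glob
    import re
    from collections import Counter

    # Count two-word phrases (bigrams) across a folder of exported transcripts.
    bigrams = Counter()
    for path in glob.glob("transcripts/*.txt"):  # hypothetical export folder
        with open(path, encoding="utf-8") as f:
            words = re.findall(r"[a-z']+", f.read().lower())
        bigrams.update(zip(words, words[1:]))

    # The most frequent phrases hint at the language customers really use.
    for (first, second), count in bigrams.most_common(20):
        print(f"{first} {second}: {count}")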

For example, a call recording for one plumbing company showed that a quick blog post on how to install a toilet might be a beneficial addition to the content on their site.

Note: If you want to learn more about how call recordings can benefit your specific business, schedule a personalized demonstration of CallRail’s powerful and intuitive call analytics software.

3. Answer the Public and BuzzSumo for User Questions

Both tools help with the specific questions that actual humans ask about topics and issues. Answer the Public is a free tool that brings you all the questions that come from the autosuggest results in Google and Bing: “As you type you are presented with an aggregated view of the questions & therefore a hint of the motivations & emotions of the people behind each search query,” the company states.

For example, a local plumber could look into Answer the Public to see what questions people search related to toilets. Based on those questions, the company could write an FAQ section, specific how-to posts (like “How to Install a Toilet”), or answer common questions (like “Why does my toilet fill slowly?”).
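You can also sample the raw autosuggest data these tools draw on. The sketch below queries Google’s public suggest endpoint – an unofficial, undocumented endpoint that can change or throttle requests at any time, so treat it as illustration rather than a supported API:

    import json
    import urllib.parse
    import urllib.request

    def google_suggestions(prefix):
        # Unofficial autosuggest endpoint used by browsers; not a supported API.
        url = ("https://suggestqueries.google.com/complete/search"
               "?client=firefox&q=" + urllib.parse.quote(prefix))
        with urllib.request.urlopen(url) as resp:
            charset = resp.headers.get_content_charset() or "utf-8"
            data = json.loads(resp.read().decode(charset))
        return data[1]  # response shape: [query, [suggestions], ...]

    for prefix in ["why does my toilet", "how to install a toilet"]:
        print(prefix, "->", google_suggestions(prefix))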

BuzzSumo is a paid tool that does something similar. The tool has a Question Analyzer where you can put in a topic and see the actual questions that people have asked related to that topic across forums and the Web. You can choose to either go into the forum and answer questions as an expert or use those questions to determine what answers you provide on your own site.

Both of these tools give you up-to-date data on the searches of real people that you can use to craft your content.

4. SpyFu for Competitor Keyword Info

SpyFu is another paid tool with a wealth of keyword data. With it, you can compare your organic keyword rankings to those of your top competitors.

The data shows you what keywords your competitor ranks for that you don’t, and keywords that you’re both vying for. It also shows you keywords that you rank for exclusively – often long-tail keywords related to your business – along with search volume, paid data, and the potential difficulty of ranking for each keyword.

This keyword data should be the blueprint that informs your content strategy.

Join the Ranks of Elite SEOs: Make Keyword Strategy Your SEO Strategy

Keyword research gives SEOs the basis we need to make the rest of our strategic decisions. You can plan your content and technical implementation around the information you learn from your keyword research—and it doesn’t have to be tedious and time-consuming. These tools allow you to keep your research fresh, agile, and true to what users are actually searching for at that time.

Use these tools to start improving your SEO through keyword research, and watch your organic leads and conversions increase.

By Carolyn Lyden

Carolyn is based in Atlanta, where she put a love of spreadsheets, an MBA, and years of marketing experience together when she joined CallRail’s marketing team as the SEO manager in March 2017.

CallRail provides lead tracking and call analytics tools for more than 50,000 businesses across the US and Canada. Want to know where your leads are coming from and how qualified they are? Sign up for a completely free 14-day trial of CallRail’s award-winning software. No credit card needed.

Sourced from MarketingProfs

People are freaked out by ads that follow them around after they google a product.

By MediaStreet Staff Writers

Personalised ads now follow us around the web, their content drawn from tracking our online activity. We in the ad industry have suggested that people are okay with it – that people see benefits roughly equal to perceived risks.

A study by University of Illinois advertising professor Chang-Dae Ham says otherwise, suggesting the ad industry may want to reconsider its approach.

“The perception of risk is much stronger than the perception of benefit,” Ham found in surveying 442 college students on how they coped with what is known as online behavioural advertising. “That drives them to perceive more privacy concern, and finally to avoid the advertising,” he said.

Previous studies have looked at various aspects of online behavioural advertising (OBA), but Ham said his is the first to investigate the interaction of various psychological factors – or mediating variables – behind how people respond to it and why they might avoid ads.

“The response to OBA is very complicated,” he said. “The ad avoidance is not explained just by one or two factors; I’m arguing here that five or six factors are influencing together.”

Ham examined not only interactions related to risk, benefit and privacy, but also self-efficacy (sense of control); reactance (reaction against perceived restrictions on freedom); and the perceived personalization of the ads.

He also looked at the effect of greater and lesser knowledge among participants about how online behavioural advertising works. Those with greater perceived knowledge were likely to see greater benefits, but also greater risk, he found. Like those with little perceived understanding, they tilted strongly toward privacy concerns and ad avoidance.

Ham’s study of online behavioural advertising follows from his interest in all forms of hidden persuasion; his previous research has looked at product placement, user-generated YouTube videos and advergames. But OBA is “a very special type,” he said, in that it elicits risk perceptions and privacy concerns different from those elicited by these other forms.

The study conclusions could have added significance, Ham said, because research has shown that college-age individuals, like those in his study pool, are generally less concerned about privacy than those in older age groups.

If his findings are an accurate reflection of consumer attitudes, Ham said they could represent “a really huge challenge to the advertising industry” since online behavioural advertising represents a growing segment of advertising revenue.

Ham thinks advertisers, in their own interest, may want to make the process more transparent and controllable. “They need to educate consumers, they need to clearly disclose how they track consumers’ behaviour and how they deliver more-relevant ad messages to them,” he said.

Giving consumers control is important because it might keep them open to some personalised online advertising, rather than installing tools like ad blockers, in use by almost 30 percent of online users in the U.S., he said.

With little understanding of online behavioural advertising, and no easy way to control it, “they feel a higher fear level than required, so they just block everything.”

It’s all the more important because the technology is only getting better and more accurate, Ham said. Tracking systems “can even infer where I’m supposed to visit tomorrow, where I haven’t visited yet.”