By Mike Wickham

As the sands shift around digital marketing, says Mike Wickham of Impression, it might be time to reconsider how we target customers online.

Good marketing should always be a win-win. The consumer should win because they’re being provided with a relevant option for whatever it is they’re in the market for. The brand should win by meeting that need and by providing its product or service to the right audience, hopefully, at the right cost.

As someone who navigates the worlds of both marketing and consumerism, I’m noticing a worrying trend towards fewer, less relevant options presented across paid media platforms.

The algorithm isn’t always our friend

Let me give you an example. I was recently on a quest to find the perfect pair of shoes. Versatile enough for all seasons, suitable for both smart and casual attire, and durable for years to come. Alas, I’m still searching, and not just because I’m incredibly fussy.

My customer journey began the same as most, with a broad search on Google, and I was served a range of options from boots to sandals. Not quite right, but after navigating to the shopping tab, I found a few items closer to what I was picturing in my head.

After clicking on a few options from different brands and browsing their catalogues I still hadn’t found the dream pair, but I had at least narrowed down the style I was looking for. So I returned to Google and provided a bit more detail for my next search (long-tail searches do still exist), only to receive virtually the same list of items in the carousel as before.

The results were pretty much exclusively from the three brands that I just visited. For the following days and weeks, browsing across the web provided me with limited new suggestions. I was re-served the same items time and time again. A poor use of frequency capping is partly at fault here, but the crux of it is, my behaviour gave signals that I was interested in these items, and so the algorithms pushed hell for leather to get me to convert.

I sympathize with these brands, and advertisers in general, who face similar challenges. With a shift towards larger audience definitions and a heavier reliance on machine automation, they’re a little at the mercy of the algorithms to distinguish who is the right customer.

How to identify the most likely customers

So what can we do to help differentiate between a person who clicks a visual ad of a product, engages with the website and decides the product isn’t quite right for them, versus a person who clicks a visual ad of the product, engages with the website and then decides that while they most likely will buy, they first want to compare prices elsewhere and wait for payday?

It ultimately comes down to developing a better understanding of the behaviour and psychology of your consumers. There are often more reasons not to buy something than there are to buy it, so we must begin to dig much deeper.

It starts with research. Understanding consumer behaviour to uncover the ‘why’ behind the engagement – as well as the ‘why not’. Is it to do with affordability, a lack of urgency, or too much choice? Or is it down to concerns over compromise, distraction, likeability, trust, principles, ethics… and so much more? The list of conscious and subconscious reasons for not proceeding can be many and varied.

Behavioural insight often starts with old-fashioned methods, like actually talking to people. Focus groups, surveys and questionnaires are often seen as archaic to digital-first businesses, but they will provide the insights that will help you identify where to begin looking within the data.

Don’t chase every would-be buyer

We have to measure in different ways than before. Parsing small but significant signals of consumer intent – such as attention mapping, engagement depth, dwell time, and frequency of interaction – will help to build a clearer distinction between a genuinely interested buyer and a passer-by.

By identifying and excluding those who have shown signals of dis-intent, we’re able to better place our energy into more qualified customers, while the same data informs how we adapt our customer journeys to capitalize on the ‘likely buyers’.
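To make that idea concrete, here is a toy sketch in Python of how such a segmentation might work. Every signal name, weight, and threshold below is invented for illustration – no ad platform exposes its actual scoring logic, and real systems would use far richer signals.

```python
from dataclasses import dataclass

@dataclass
class Session:
    dwell_seconds: float   # time spent on product pages
    scroll_depth: float    # 0.0-1.0, how far down the page the user scrolled
    return_visits: int     # repeat visits within the campaign window
    bounced: bool          # left immediately after landing

def intent_score(s: Session) -> float:
    """Toy weighted score; the weights are illustrative, not industry standards."""
    if s.bounced:
        return 0.0
    score = 0.0
    score += min(s.dwell_seconds / 120.0, 1.0) * 0.4   # cap dwell credit at 2 minutes
    score += s.scroll_depth * 0.3
    score += min(s.return_visits / 3.0, 1.0) * 0.3     # cap credit at 3 return visits
    return score

def segment(sessions: dict, threshold: float = 0.5):
    """Split users into a retargeting pool and an exclusion list."""
    likely = [u for u, s in sessions.items() if intent_score(s) >= threshold]
    exclude = [u for u, s in sessions.items() if intent_score(s) < threshold]
    return likely, exclude
```

The point of the sketch is the exclusion list: users whose behaviour signals dis-intent are removed from retargeting rather than chased with the same carousel of shoes for weeks.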

We ultimately need to be better at understanding our customers’ wants and needs. And a key part of this is knowing when to pursue them, and when to let them go. Algorithms have made it harder to do the latter, as they miss the context and the cognitive reasoning in the mind of the decision-maker.

Those are the gaps we need to fill, and it’s the blending of behavioural insights with your machine learning tools that will not only help the marketer become more effective with their advertising spend, but also help bring relevancy back to the consumer.

Like I said – win, win.

Feature Image Credit: Remy Gieling via Unsplash


Sourced from The Drum

Or how I learned to stop worrying and love the algorithm.

You know how your Instagram feed starts sending you ads for khakis the minute you think about how you need a new pair of pants? Well, spirits giant Diageo is further immersing itself in the world of tech that knows what you want before you know what you want with the acquisition of flavour matching company Vivanda.

While not quite as nefarious-sounding as the real-life blocking or memory recall of a Black Mirror episode, this is indeed a look at what the future may hold for whisky consumers. Diageo has actually been using Vivanda’s technology since 2019 in several markets, including the “Journey of Flavour” experience at Johnnie Walker Princes Street in Edinburgh, as well as stores, ecommerce channels and the website Malts.com. It’s also the foundation of the “What’s Your Whisky” website, which works like this: Vivanda’s “FlavorPrint” system is powered by artificial intelligence. By asking you a series of questions, it maps out your individual flavour preferences and suggests which whisky you should try based on your specific FlavorPrint. Once you get your results, you can click to purchase a bottle of Talisker, Lagavulin or Oban, depending on your results.
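FlavorPrint’s internals are not public, but the general idea – turning quiz answers into a flavour vector and ranking bottles by similarity – can be sketched in a few lines of Python. The flavour dimensions and every number below are entirely made up for illustration.

```python
import math

# Hypothetical flavour profiles (smoky, fruity, sweet, spicy), each 0-1.
# These values are invented -- Vivanda's real model is not public.
WHISKIES = {
    "Talisker":  [0.8, 0.2, 0.3, 0.7],
    "Lagavulin": [0.9, 0.1, 0.2, 0.5],
    "Oban":      [0.3, 0.7, 0.6, 0.3],
}

def cosine(a, b):
    """Cosine similarity between two flavour vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def recommend(user_profile):
    """Return whiskies ranked by similarity to the user's flavour preferences."""
    return sorted(WHISKIES, key=lambda w: cosine(user_profile, WHISKIES[w]),
                  reverse=True)
```

A quiz answer pattern leaning fruity and sweet, say `[0.2, 0.8, 0.7, 0.2]`, would rank the gentler Oban ahead of the peatier Talisker and Lagavulin.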

Diageo plans to expand the use of Vivanda’s technology to other categories within its sizable portfolio, as well as using it to support “the continued development of our advanced analytics and digital marketing capabilities” to provide better understanding of just exactly what it is you like to drink, according to a press release. “We know consumers are looking for more personalized, interactive experiences and that they are increasingly engaging with our brands digitally as well as in person,” said Diageo chief marketing officer Cristina Diezhandino in a prepared statement. “We’re delighted to welcome Vivanda to Diageo and we are looking forward to working together to connect with consumers in more innovative ways that help shape the future of how we socialize in person and virtually.” So far the whisky has not become sentient and experienced its first sensation of love, but we are still in early days.

Feature Image Credit: Charl Folscher/Unsplash

Sourced from Robb Report


Twitter is positioning itself to eventually allow brands to sell products through the service by first improving on its ability to show users relevant ads and increasing the likelihood they will click the ad.

Twitter on Tuesday rolled out new ad features and revamped the algorithm that decides which ads users see, as part of an effort to lay the groundwork to launch future ecommerce features, the social networking company told Reuters. The new features come as Twitter is pushing to grow its performance advertising business – a strategy that aims to quickly generate sales and that constituted just 15 percent of Twitter’s business last year. The effort could help Twitter reach its goal of doubling annual revenue by 2023.

The San Francisco-based company is positioning itself to eventually allow brands to sell products through the service by first improving on its ability to show users relevant ads and increasing the likelihood they will click the ad. “Performance ads are a very large opportunity … that’s relatively untapped for us,” said Kamara Benjamin, group product manager at Twitter, in an interview. “Ultimately, this is going to lead to people installing apps, visiting websites and finding products that meet their needs.”

Ads promoting downloads for mobile games and other apps, which are a major type of ad on social media sites, will now allow users to initiate the download without leaving the Twitter app, the company said in a blog post on Tuesday. Previously, users had to leave Twitter to download other apps. Twitter added it is working on new tools to let companies run ads to find customers who are more likely to make in-app purchases.

Slide-show ads that feature multiple products can now send users to different websites when they click the ad, whereas previously brands could only choose one destination. This increased the number of clicks by 25 percent on ad campaigns that set a goal of driving website visits, the company said.

Twitter also improved the advertising algorithm, showing ads to a larger pool of people at the beginning of a campaign so it can better gauge user interest, Benjamin said. Those algorithm improvements led to a 36 percent increase in ad campaigns that achieved at least five downloads during the time period that the ad ran on Twitter, the company said.

Sourced from News18

A few weeks have passed since Google rolled out its latest broad core algorithm update. Articles have circulated highlighting sites that experienced massive wins and others that experienced the opposite. The prevailing pattern I see whenever Google rolls out broad core algorithm updates is questions such as “how do I fix it?”, “what did I do wrong?” or “I didn’t do anything but my site traffic improved – why is that?”. There are a variety of answers given, but they don’t seem to capture the whole purpose of a broad core algorithm update. But before we delve into recovering if you were hit by the broad core update, what exactly is a broad core update?

What Is a Broad Core Update?

Simply put, it’s an algorithm update rolled out by Google multiple times a year that doesn’t necessarily target specific issues. A broad core update is more of an update to Google’s main search algorithm that takes a more holistic view of a website: its expertise, authoritativeness and trustworthiness (E-A-T), and its quality.

Because of these many varying factors, Google can’t really tell us what needs to change on a website without revealing the most important aspects of its algorithm. Think about it this way: say there are 150 ranking factors that have varying importance in Google’s eyes. When Google rolls out a broad core update, 63 of those factors change in importance and their order is rearranged. Of course, these numbers are just an example and not fact. But it’s a useful way of imagining what really happens during a broad core update.
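The analogy can be made concrete with a toy Python sketch of a weighted ranking model: the same made-up factors and sites, with only the weights changed, produce a different order. None of these factor names, weights, or scores are real ranking signals – they exist purely to illustrate a reweighting.

```python
def rank(sites, weights):
    """Order sites by a weighted sum of their factor scores (higher is better)."""
    def score(factors):
        return sum(weights[f] * v for f, v in factors.items())
    return sorted(sites, key=lambda s: score(sites[s]), reverse=True)

# Two hypothetical sites scored on three invented factors (0-1 scale).
sites = {
    "site-a": {"content_depth": 0.9, "page_speed": 0.4, "backlinks": 0.6},
    "site-b": {"content_depth": 0.5, "page_speed": 0.9, "backlinks": 0.7},
}

before = rank(sites, {"content_depth": 0.6, "page_speed": 0.2, "backlinks": 0.2})
# A "broad core update": same factors, different importance.
after = rank(sites, {"content_depth": 0.2, "page_speed": 0.5, "backlinks": 0.3})
```

Neither site changed anything, yet the order flips after the reweighting – which is why a traffic drop after a core update doesn’t necessarily mean you did something wrong.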

Here’s an example of how different broad core updates affected one of our clients:

Screenshot of Analytics Traffic and Google Updates

This shows us how different broad core algorithm updates affect websites in a varying manner every time. The blue dots above the graph show when the broad core updates happened. So, we can infer that regular broad core algorithm updates target different aspects every time they are rolled out – not the same thing over and over again.

Google’s Take on Recovering From Broad Core Updates

Getting hit by an algorithm update is a normal occurrence, especially if your website still has a lot of room for improvement. In a recent Google Webmaster Hangout, a webmaster asked Google’s own John Mueller about a drop in traffic for their news site. Here’s the full question:

“We’re a news publisher website that’s primarily focusing on the business finance vertical. We probably have been impacted by the June Core Update as we’ve seen a traffic drop from the 1st week of June.

Agreed that the update specifies that there are no fixes, no major changes that need to be made to lower the impact. But for a publisher whose core area is content news, doesn’t it signal that it’s probably the content, the quality or the quantity, which triggered Google’s algorithm to lower down the quality signal of the content being put up on the website which could’ve led to a drop of traffic?

We’re aware that many publisher sites have been impacted. In such a scenario, it would really help if Google could come out and share some advice to webmasters and websites. Not site specific, but category or vertical specific at least on how to take corrective measures and actions to mitigate the impact of core updates. It would go a long way in helping websites who are now clueless as to what impacted them.”

Screenshot of Google Webmaster Hangouts

John Mueller went on to give the best answer he could possibly give, and here’s a summary of his answer (not verbatim):

“There’s nothing to fix since there’s no specific thing that the algorithm targets. A variety of factors that relate to a website sometimes evolve, and that affects your traffic and rankings. There are no explicit changes you can make, but there is an old blog post (published in 2011) on the Webmaster Central Blog that’s basically a guide to building high-quality sites, and he highly recommends that webmasters read it.”

Watch the Webmaster Hangout

There you have it. There’s nothing to fix, but there is a lot of room to improve on. The blog post that John Mueller recommended contains a list of questions (not necessarily actual ranking signals) that would help you understand what Google thinks about when it ranks your site:

  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Would you be comfortable giving your credit card information to this site?
  • Does this article have spelling, stylistic, or factual errors?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?

Those are just some of the questions listed in the blog post. Aside from originality and usefulness, another thing that came to mind while I was reading it is that even before the term E-A-T was coined, Google was already treating it as an important factor for rankings. Successfully proving that your site’s author is an expert, that the body of content is of value, and that the facts in the content (and on your site) are trustworthy – all of this equates to E-A-T.

How to Recover from Google’s Broad Core Updates

  • Use the Guidelines Provided by Google to Look for Inadequacies in Your Website

    • Use the guidelines (the 2011 blog post) to your site’s best advantage. They already give you a view of the things your site can do better. Capitalize on that. It could take a lot of time and effort, but if you want to be successful in your SEO, it’ll be worth it. Additionally, you can read Google’s updated Search Quality Raters Guidelines to deepen your understanding of Google’s standards for a high-quality, useful website.
  • Improve your Site’s Expertise, Authority, and Trustworthiness (E-A-T)

    • There are a lot of things to do here, but the first should always be to improve your author E-A-T. It’s a simple thing to talk about but difficult to do. You have to prove that you’re an expert in the area you’re writing about. In recent times, medical and pharmaceutical websites have been a target of algorithm updates, since not all of their content comes from reputable or expert authors. Here’s a simple tip: if you start configuring your site to the guidelines highlighted in the Webmaster Central blog post, your E-A-T will improve as well.
  • Ask for Help

    • As the owner of a website, it is hard to see your site’s faults since you only have your own perspective to work from. But if you ask for help from a community that shares your interests or knowledge, its members can give their thoughts on your site, and you’ll be able to see faults you were never able to see before. The SEO community is a particularly large one, with its fair share of intelligent and helpful people willing to give you their two cents. Don’t be afraid to ask for help from the community or anyone you know – it’ll help you grow as well.
  • Think Holistically

    • As I’ve mentioned, it’s a mistake to focus on a specific thing or factor when it comes to Google’s broad core updates. Sometimes, not focusing on a specific thing allows you to discover the reason why your site was impacted by the broad core update. Additionally, not focusing too much on the nitty gritty keeps you open-minded and allows you to consider many possibilities that help you diagnose your site’s traffic or ranking drop.

The author is a motivational speaker and the head honcho and editor-in-chief of SEO Hacker and God and You. Check out his SEO School and SEO Services site.

Sourced from SEO Hacker

Amid a ferris-wheeling slew of scandals with respect to objectionable content across its air, YouTube has reportedly been developing a new algorithm to reward content of “quality.”

According to Bloomberg, YouTube has been developing two internal metrics over the past two years — one that is straightforward and gauges total time spent on the service (including posting and reading comments; not just watching videos) and a second that is slightly more nebulous, which the video giant is still working out. This second stat, per Bloomberg, is being referred to internally as ‘Quality Watch Time‘, and aims to identify content that is not only appropriate but constructive and responsible in some way.

Such an algorithmic development would presumably seek to help YouTube promote ‘quality’ videos — whatever that means — while marginalizing inappropriate videos and extremism. That said, the quality watch time metric could reportedly be used to calculate more than just video recommendations, according to Bloomberg, and is also being considered in realms like search results, ad distribution, and creator compensation. In prizing videos that are constructive and responsible, the metric would also help to combat the growing notion that YouTube is addictive and encouraging of mindless entertainment, according to Bloomberg.

YouTube has not finalized how the quality watch time metric will be measured. And Bloomberg notes that coming up with a scalable notion of ‘quality’ as ascertained by machine technology – or even human reviewers – feels like something of a herculean feat given the enormity of YouTube’s library and the variety of opinions out in the world.
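As a thought experiment only – the real definition is unknown and unannounced – the simplest possible form of a quality-weighted watch time would multiply each viewing session by a per-video quality score. The sessions and scores below are invented to show how the two metrics can diverge.

```python
def quality_watch_time(sessions):
    """Hypothetical metric: weight each session's watch time by a 0-1 quality score."""
    return sum(s["watch_minutes"] * s["quality"] for s in sessions)

raw = [
    {"watch_minutes": 30, "quality": 0.2},  # long but low-quality clickbait session
    {"watch_minutes": 10, "quality": 0.9},  # shorter but constructive tutorial session
]

plain_watch_time = sum(s["watch_minutes"] for s in raw)  # rewards the clickbait
qwt = quality_watch_time(raw)                            # rewards the tutorial
```

Under plain watch time the clickbait session dominates; under the quality-weighted version the shorter, constructive session contributes more – exactly the shift in incentives the reported metric would aim for. The hard, unsolved part is of course producing the quality score itself.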

YouTube told Bloomberg that “there are many metrics that we use to measure success,” but declined to comment on the development of either new metric.

YouTube implemented the ‘Watch Time‘ metric in 2012, replacing individual video views as its measurement of choice, despite the fact that critics both inside and outside of the company felt like such a shift could reward inappropriate and attention-seeking behavior, according to Bloomberg. YouTube declined to comment to the outlet about whether it might abandon watch time in favor of quality watch time.

Sourced from tubefilter

By Anna Crowe

There are plenty of rabbit holes to fall into when it comes to Google algorithm updates.

One of my favorites (after the unsolved mystery of Fred) involves the 2012 Google algorithm update on page layout and the above-the-fold movement.

Like the JCPenney link building scandal, there are still plenty of people who talk about this Google algorithm update — you just have to know who to ask. But you don’t have to go digging into the archives, we’ve got you covered here.

This post will tell you everything you need to know about Google’s page layout algorithm, how it has evolved over the years, and what it means to you now.

What Was the Google Page Layout Algorithm?

The arrival of a high-quality user experience and more sophisticated on-page SEO might have felt premature in the wake of 2011 Panda updates, but Google made it official on January 19, 2012: the page layout algorithm was here.

The page layout algorithm update targeted websites with too many static advertisements above the fold. These ads would force users to scroll down the page to see content.

Google said this algorithm would affect less than 1 percent of websites. But, the 1 percent of those sites affected were forced to design a better user experience.

This update didn’t include pop-ups or overlay ads.

Page Layout/Above the Fold Google Algorithm Update Timeline

Here’s a quick snapshot of how the algorithm has changed over time:

1. January 19, 2012: Page Layout Algorithm Launched

Google introduced the first page layout algorithm update, also known as “Top Heavy” or the above-the-fold algorithm update, impacting sites that showed too many ads above the fold.

2. October 9, 2012: Page Layout Algorithm Update

Matt Cutts, then head of Google’s webspam team, announced on Twitter that the page layout algorithm had been updated. It affected 0.7 percent of English queries. The update also gave an opportunity to websites hit by the first Google algorithm rollout to potentially recover.

3. February 6, 2014: Page Layout Algorithm Refresh

Google released a refresh of the page layout algorithm on February 6, 2014. Cutts didn’t mention the impact on search results this refresh would have on websites. While it was a refresh, it appeared Google simply reran the algorithm and updated its index.

4. November 1, 2016: Automated Algorithm

John Mueller of Google announced in a Google Webmaster Hangout at the 48-minute mark:

“That’s pretty much automatic in the sense that it’s probably not live, one-to-one. We just recrawl this, therefore, we know exactly what it looks like. But it is something that’s updated automatically. And it is something where, when you change those pages, you don’t have to wait for any manual update on our side for that to be taken into account.”

After Mueller’s comment, it seemed clear that Google would pick up on changes to your website and adjust rankings automatically after crawling your site.

Google’s Gary Illyes also confirmed, in March 2017, that the page layout algorithm is still a big deal.

Websites Impacted by the Page Layout Algorithm

With four updates to the page layout algorithm, it might seem like a significant change, but less than 1 percent of sites were affected by each of the updates.

These changes only affected sites that had an excess of ads, to the point where the user experience was weak and it might benefit the site to rethink its web design.

Cutts explained it further:

“This algorithmic change does not affect sites who place ads above-the-fold to a normal degree, but affects sites that go much further to load the top of the page with ads to an excessive degree or that make it hard to find the actual original content on the page.”

As a website owner, I can see where some people might have been upset. When it’s your page, with your ads, you’d think you should be able to organize it however you want and not be penalized.


Besides, placing ads is how many of us keep our sites profitable. Lots of prominent ad spots will pull in more revenue… but at what cost?

Some members of the WebmasterWorld forum saw an impact:

One forum member stated they were hit by the first algorithm update on February 7, 2012 and saw a 40 percent drop in traffic. They shared this image:

[Image: WebmasterWorld forum member’s traffic graph]

Another member said it took them two years to recover from the first algorithm update.

The truth is that people won’t want to visit a site if they feel bombarded with ads. They’ll get frustrated and look for whatever they were searching for – whether it’s birthday party ideas or expert golf advice – somewhere else.

Companies would rather put ads below the fold on a popular site (or fight for one of the few spots at the top) than place them as one of many banner ads on a low-traffic site.

This change benefited site owners, advertisers, and viewers.

Page Layout Algorithm Recovery

What is it about Google algorithm updates that send all of our thoughtfully crafted, tried-and-true, SEO strategies out the window?

It’s easy to get caught up in the mass hysteria of website redesigns and content rewrites, only to suffer a penalty months later. All because of one Google algorithm update?

While only 1 percent were affected, what if you were in that 1 percent? Let’s take a look at recovering from the page layout Google algorithm update properly.

First things first, the above-the-fold area depends on what screen resolution your audience is using.

For example, a user on a phone sees a different above-the-fold space than a user on a laptop.
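The underlying check is simple geometry: clip each ad rectangle to the visible viewport and compare the covered area to the total above-the-fold area. Here is a minimal Python sketch – the ad positions, viewport sizes, and any acceptable threshold are hypothetical, and real tooling (like the extensions mentioned below) inspects the rendered page for you.

```python
def above_fold_ad_ratio(ad_boxes, viewport_w, viewport_h):
    """Fraction of the above-the-fold area covered by ads.

    ad_boxes: (x, y, width, height) rectangles in CSS pixels,
    measured from the top-left of the page.
    """
    visible = 0
    for x, y, w, h in ad_boxes:
        # Clip each ad rectangle to the above-the-fold viewport.
        clipped_w = max(0, min(x + w, viewport_w) - max(x, 0))
        clipped_h = max(0, min(y + h, viewport_h) - max(y, 0))
        visible += clipped_w * clipped_h
    return visible / (viewport_w * viewport_h)

ads = [(0, 0, 728, 90), (1000, 100, 300, 250)]   # leaderboard + sidebar unit
desktop = above_fold_ad_ratio(ads, 1366, 768)
mobile = above_fold_ad_ratio(ads, 375, 667)      # same page on a phone viewport
```

Running the same layout through two viewports shows why the mobile check matters: the sidebar unit that sits comfortably on desktop may vanish entirely off-screen on a phone, while the leaderboard eats a proportionally similar slice of the smaller fold.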

So, before removing any above-the-fold ads, use the Screen Resolution Tester Chrome extension. (Fold Tester is another free tool you can use to test your layout visually on the desktop.)

You can also review Google Code’s blog post that dives deep into the above-the-fold screen tests.

Google also provides an image to display what Google is looking for in the above-the-fold area:

[Image: Google’s illustration of the above-the-fold area]

If you do get hit with a penalty for too many ads above the fold, you’ll have to wait for Googlebot to recrawl your website. The good news is that now that Google makes updates in real time, you no longer have to wait for the next algorithm refresh.

The Final Impact of the Page Layout Algorithm

Dealing with a Google algorithm change like the page layout update can feel like falling into an abyss of constant web design changes over and over again. While I’ll be the first to admit that this update may have seemed harsh, Google’s message was clear: They value user experience.

Removing ads above-the-fold shouldn’t involve reducing your revenue. And, if I’m digging deep into the intent behind this algorithm update, it seems Google was laying the foundation for a mobile-only index.

Google instructed us to review ads above-the-fold on all screen resolutions, including mobile. With a smaller screen resolution on mobile and tons of ads, users would have to scroll down 5 to 6 times more than on a desktop. Without ads above-the-fold, users receive a better experience on desktop and mobile.

The truth of the matter is: Google’s page layout algorithm update was made for the user, not the webmaster. It’ll pay off in the long-run to be choosy about your advertisers.

Post-page layout algorithm change, the sites that best balance ads and content win.

We know, especially now, that there is a fine line between keeping a site profitable and keeping users engaged. This algorithm may have gone into effect years ago, but the lessons it taught us – and the problems it fixes – are still relevant.


Image Credits
Featured image: Created by Danny Goodwin
In-line post #1: WebmasterWorld
In-line post #2: Google



Nutella’s manufacturer Ferrero partnered with its advertising agency Ogilvy & Mather Italia to devise a plan to get people to buy more Nutella. Their idea? Have an algorithm design the packaging. The company provided the software with a database of patterns and colors that Ferrero felt fit the hazelnut spread brand. It then generated seven million unique jar designs, which were sold throughout Italy.

You can see some of the designs below. They look nice. I wouldn’t mind having them at home, although I don’t live in Italy, so that’s never going to happen. It was a limited project. Still, good job, software, you’ve got the mojo to make packaging for hazelnut spread. What a feat.

Image Credits: Ferrero, Nutella


Sourced from The Verge