By Ola King
Search marketers can get bogged down worrying about high-quality content, link building strategies, and technically sound sites. But when it comes to SEO, we need to take a step back and look at the what and the why in order to get results.
To that end, Moz’s own Ola King walks you through the three main pillars, or as he calls them, “bosses”, of SEO work. All of your SEO strategies feed into their demands, but they all need different things.

Video Transcription
Hi, Moz fans. I’m Ola King. I work at Moz, and I’m excited to join you today for this edition of Whiteboard Friday. I will be talking to you about the three bosses of SEO.
Creating high-quality content, building a solid link building strategy, and keeping your site technically sound are all great things to do for SEO. However, none of them will be as effective if you're not looking at things through a strategic, wider lens. Basically, you have to take a step back and look at what you're doing and why you're doing it in order to get the results that you need.
So for SEO, there are really three main pillars to consider. I call them the three bosses of SEO: your business, your searchers, and your search engines. Each of these bosses has its own individual needs.
Boss #1: Your business

So let’s start with the business. So these are the needs of the business. This is by no means a comprehensive list. I’m sure there are things that I’m missing. So if there are things that you think should be here, please leave a comment and we can have a discussion on that so we can all learn from each other. But the whole idea of this is to get you thinking about things from a broader lens before you dive into tactics.
Key metrics and goals
So the first one is key metrics and goals. Any activity that is done without a goal is essentially a hobby, which is fine. However, if you want to do serious SEO work, you need to have a goal. To know what your SEO goals are, you first have to look at your business goals.
Those determine your marketing goals, which then determine your SEO goals. So understand what your KPIs are, understand what your priorities are, and that will let you know what your next steps are. For example, if your goal is to get more traffic, you need to focus more on top-of-funnel content, like an ultimate guide.
If your goal is to get more leads, you might start looking at maybe your product comparison pages. Then if your goal is to have more sales, then it might be time to start optimizing your product pages for example. So always look at your key metrics and goals and then work from there.
Competitors
So competitors are also something you should really consider. A lot of people are very familiar with who their direct competitors are in terms of products or services.
But when it comes to SEO, there are also informational competitors: people who might not be doing the same thing as you, but who provide information to your ideal audience. So always keep an eye on those competitors as well.
Resources
The resources. Look at the resources that you have in terms of time, budget, and personnel. If you don't have the time for SEO, you might consider outsourcing it. Or if you don't have the right talent for link building, you might want to partner with an agency that does. So always take stock of your resources before you start thinking about what you should do.
Brand identity + recognition
The brand identity and recognition also determines the types of content that you go after. It doesn’t matter if the content has a lot of volume and it’s trendy. If it doesn’t align with your brand in the long run, it’s not really a very good use of your time.
Area of expertise
The area of expertise is very much related to this. What are you an expert at? Try to lean on your expertise. If you don't have the expertise but you want to provide that information to your audience, you might want to collaborate with people who are better suited to it, so that you can still meet your goal for your business and audience.
Strengths
Strengths are closely related to expertise, but this is about what talents and skills you have. Are you better at doing research and creating long-form content, or are you better at creating things that go viral, like listicles? Lean into your strengths and collaborate as needed with people who can help with your weaknesses.
Time in business
The time in business also shapes the approach you take to SEO. What a brand-new website needs is completely different from what a long-established business with a great website needs when it's just doing a refresh, which is different again from a business that has been around for a very long time but doesn't have a very good online presence.
All of this affects the way you approach content, link building, and trying to rank for competitive terms. So that's your business. As I mentioned, I'm sure there are things I'm missing, so I'm very curious to know what other things you might come up with as well.
Boss #2: Searchers

So next up, let's look at the searchers. These are the people that you are serving as a business. The first thing, when it comes to the searchers, is to look at your persona. What are the types of people that you're trying to attract to your website? There is no point in creating any piece of content if you don't know who you are trying to attract with it. So start with the persona.
Search intent and relevance
Once you’ve identified the persona, you can then start looking at the search intent and relevance.
So what are they looking for? The good news is the answer is already right on your search engine results pages. Do a quick search for your ideal keyword and you’ll be able to see the results that the search engines have deemed as the most appropriate for what your audience is looking for, which matches the search intent. Once you’ve done that, then you’re going to want to create the right content to satisfy the searcher’s intent.
Topics, not keywords
When you're creating content, focus on topics, not keywords. Gone are the days when you could create a page, stuff it with as many keywords as you can, start ranking, and print dollars. That's not so effective anymore. Instead, you want each page on your site to cover one focused topic.
While you’re doing that, then you want to make sure that you have the most comprehensive page that answers that searcher’s intent. Cyrus Shepard actually has a great
Psychological and socioeconomic factors
So when you’re creating your content or you’re trying to devise your content strategy, always look at the emotion, psychology, social, and economic factors that are affecting your audience. It’s easy to look at data on your site’s traffic and obsess about what could have gone wrong in terms of your competitors or other factors. But you might also want to take a step back and look at what’s happening in the lives of your audience, like what are they struggling with right now.
Over the past 18 months, every one of us has been experiencing the pandemic, and that has changed the way people search for things. Searches for keywords like "remote" and "delivery" have gone up over the past few months, based on the social factors affecting people. It means they can no longer do things they were able to do before, so now they're having to adjust in different ways. So always look at what's happening to your audience and then react accordingly.
Brand affinity and trust
The brand affinity and trust also affects the way people interact with your site. If people are familiar with a brand, they are more likely to trust them and interact with them more.
So if you're a newer website or brand, it might be a good idea to let the content speak for itself and not try to make your brand front and centre. For a bigger brand, it might be a good idea to do the opposite. A site like Amazon would do well to have its brand name in the title tag, for example, because people know and trust the brand and will click on the site, whereas a brand-new website might not want to make that the focus of attention.
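To make this concrete, here is a minimal sketch of a helper that builds a title tag with or without a brand suffix. The function name, example titles, and the ~60-character cutoff are my own illustrative assumptions, not a Moz recommendation.

```python
def title_tag(page_title, brand=None, max_len=60):
    """Build a title tag, optionally appending a brand name.

    Established brands can lean on their name for trust and clicks;
    newer sites may choose to leave it out and let the content lead.
    """
    title = f"{page_title} | {brand}" if brand else page_title
    if len(title) > max_len:
        # Titles much longer than ~60 characters often get truncated in SERPs.
        print(f"Warning: '{title}' is {len(title)} characters, over {max_len}")
    return title

# A recognised brand appends its name; a new site keeps the title clean.
print(title_tag("Wireless Headphones", brand="Amazon"))
print(title_tag("Choosing a Microphone"))
```

The exact length threshold varies because Google truncates by pixel width, not character count, so treat the number as a rough guide.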
Trends and seasonality
So other things to look at are trends and seasonality. As you’re looking at your SEO data, if you notice a dip, you might not be doing anything wrong. It could just mean that it’s the nature of the time of the year. So I’m sure certain keywords would trend upward around the holiday season, for example, for things like electronics, video games, etc.
Then toward February or March, those searches might drop off. It doesn't mean you're doing anything wrong. It's just the seasonality.
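One simple way to separate a seasonal dip from a real problem is to compare each month against the same month last year. This is a minimal sketch with made-up traffic numbers; in practice you would pull monthly organic sessions from your analytics tool.

```python
# Hypothetical monthly organic sessions for the same five months, two years running.
this_year = {"Nov": 9500, "Dec": 12000, "Jan": 8000, "Feb": 5200, "Mar": 5000}
last_year = {"Nov": 9000, "Dec": 11500, "Jan": 7800, "Feb": 5100, "Mar": 4900}

for month, sessions in this_year.items():
    yoy_change = (sessions - last_year[month]) / last_year[month]
    # If traffic dips month over month but tracks last year's pattern,
    # it's likely seasonality rather than an SEO problem.
    label = "in line with last year" if abs(yoy_change) < 0.10 else "worth investigating"
    print(f"{month}: {sessions} sessions ({yoy_change:+.1%} YoY) - {label}")
```

Here the February drop looks alarming month over month, but year over year it's within a few percent of the previous February, which points to seasonality.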
Search behaviour
So the search behaviour as well. People's behaviour changes over time. Humans are not robots; they are very dynamic.
The things they search for change. As I mentioned before, when emotional, psychological, social, and political factors shift, how people search changes too. So always pay attention to what people are doing, try to understand what's changing in their search behaviour, and react accordingly.
Customer journey
The customer journey is very important. Always understand the touch points that your customers have with your business. Even outside of your business, look at their journey before they get to your business. This allows you to know the types of content you need to create to fill in the gap in their journey. This allows you to know who you might need to collaborate with, so other information sources that your audience has, where they hang out. You are able to understand those things and be able to create the perfect content for them and also promote it in the right places as well.
Struggles
The struggles. What are the things keeping your audience up at night? What are they struggling with? Understanding this allows you to create content that no other person could create. To your audience, it will almost seem like you have a magic wand that lets you predict what's going on with them.
Try to understand their struggles. You can find them by looking at the questions your audience asks your help team, for example. That's a good place to start. Use SEO tools for keyword research to find the questions they're asking that indicate struggles. Forums like Quora and Reddit are also good places to find those struggles.
Location and language
Location and language affect how people search for things. Different locations have their own slang, culture, behaviours, and ways of doing things. Try to understand the location that you're targeting. Try to understand what the culture is like, what the language is, and create your content with that in mind. If you don't have that expertise or knowledge, it's a good idea to partner up with someone in those locations as well.
Also make sure that your site is internationalized as well if you’re targeting multiple countries. There are lots of resources that teach you how to do this. You can find that in the Moz [SEO Learning Center] as well.
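A core piece of internationalization is telling search engines which URL serves which locale via hreflang annotations. Here is a minimal sketch that generates those tags; the domain, locale codes, and URL structure are hypothetical, so adjust them to your own site.

```python
# Hypothetical localized versions of the same page.
locales = {
    "en-us": "https://example.com/us/",
    "en-gb": "https://example.com/uk/",
    "fr-ca": "https://example.com/fr-ca/",
}

def hreflang_tags(locales, default="https://example.com/"):
    """Build the <link rel="alternate"> tags each localized page should carry.

    Every localized version should list all the others, including itself.
    """
    tags = [
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in locales.items()
    ]
    # x-default tells search engines which page to show for unmatched locales.
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{default}" />')
    return "\n".join(tags)

print(hreflang_tags(locales))
```

The same set of tags goes in the head of every localized version of the page, which is easy to get wrong by hand, so generating them from one source of truth helps.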
Accessibility
Accessibility. Different people search for things in different ways, and people have different needs. So make sure that your site is universally accessible to everyone. Make sure it's mobile friendly. Make sure you don't have annoying pop-ups everywhere. Make sure that you provide alt text for your images to make your content more accessible to all.
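Checking for missing alt text is easy to automate. This is a minimal sketch using only Python's standard library; the HTML snippet is made up for illustration, and in practice you would feed in a fetched page's markup.

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collect <img> tags that are missing alt text."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # Flags both absent and empty alt attributes. Note that purely
            # decorative images legitimately use alt="", so refine as needed.
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "(no src)"))

page = """
<img src="/hero.jpg" alt="A whiteboard covered in SEO notes">
<img src="/chart.png">
<img src="/logo.svg" alt="">
"""

audit = AltAudit()
audit.feed(page)
print("Images missing alt text:", audit.missing)
```

A crawler-based tool will do this at scale, but even a small script like this run against your key templates catches the most common gaps.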
So these are the factors that are affecting the searchers. There’s a lot that I probably missed, so I would love to know what you think and also other ones that I forgot.
Boss #3: Search engines

So last but not least, the search engines. In order to win at SEO, you really need to understand that the search engines are businesses as well.
Business model
So in order for them to rank your site, your site has to be in line with their business. For Google, if you want to understand their business model, there is a video on YouTube that you should watch.
It’s called “A Trillion Searches, No Easy Answers.” It’s a very interesting video that shows you the behind the scenes of how they think about things, what challenges they have, and the future of where they’re heading. This would then allow you to be able to know where they might go next so that you can react accordingly.
For Google, once again, I mean ultimately they are just trying to provide content to their searchers that is valuable, that is from sites that are indexable, that provides a good experience, and of course it has to be relevant content.
Natural language processing
They put a huge emphasis on relevant content, which leads us to the next one: NLP. Nearly every change Google has made over the past few years is geared toward the goal of helping people get answers to the things they search for in a natural way, basically making search more human.
That allows them to help people find the content that's relevant to them, using advancements in machine learning. So in order to do well at SEO, you need to understand what they're doing with these updates. Read the release notes, try to understand what each update means, and then cater your content to match that goal as well.
E-A-T
E-A-T stands for expertise, authoritativeness, and trustworthiness. Google is very strict on this when it comes to sites in the Your Money or Your Life (YMYL) categories, so that's health, finance, fitness, things like that. So make sure that your site is displaying the signals they need for this authority.
There are a lot of resources out there. I wish I could spend more time to explain this, but we have limited time. But make sure you look into this so you can follow the right guidelines for the E-A-T.
Links
Links. I don't need to explain this too much; everyone who works in SEO is pretty much familiar with it. But links are basically the digital word of mouth, and a lot of people are familiar with getting backlinks.
But just as important as getting backlinks, you also want to make sure that you're spreading internal links. Make sure that the pages on your site that are getting high traffic link to pages that might not be getting as much traffic but are just as important to you.
Core web vitals
This is a recent update, the Core Web Vitals. It's basically meant to encourage better, faster websites. A lot of people debate its effectiveness at this very moment, but I would say you should do your best. Use tools like the Moz Performance Metrics Beta and improve your site as best you can, so that you're at least prepared when these changes do start affecting your ranking power.
Indexability
Indexability. Of course, make sure your site is indexable to the search engines. Make sure things like your robots.txt file are set up well, and make sure there are no HTML or JavaScript errors. Reduce pages on your site that have no value so you're not taking crawl budget away from your most important pages. Look at your site's architecture and make sure things are set up correctly so your site is very indexable.
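To make the robots.txt point concrete, here is a minimal sketch. The paths and sitemap URL are hypothetical; adapt them to your own site:

```
# robots.txt sketch: steer crawlers away from low-value pages so
# crawl budget goes to the pages that matter.
User-agent: *
Disallow: /cart/
Disallow: /internal-search/
# Everything not disallowed stays crawlable by default.

Sitemap: https://example.com/sitemap.xml
```

Note that robots.txt controls crawling, not indexing; a page you never want indexed needs a `noindex` directive rather than just a Disallow rule.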
Schema markup
Take advantage of schema markup. It helps the search engines understand your website very clearly. Having schema markup doesn't mean you'll always win the SERP features, but at least it gives you a fighting chance, so take advantage of it as well.
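A common way to add schema markup is a JSON-LD block in the page's HTML. This is a hedged sketch using the schema.org `Article` type; the date value is a hypothetical placeholder:

```html
<!-- JSON-LD sketch describing an article page to search engines. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "The 3 Bosses of SEO",
  "author": { "@type": "Person", "name": "Ola King" },
  "datePublished": "2022-01-01"
}
</script>
```

Google's Rich Results Test can confirm whether markup like this is eligible for SERP features.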
Query deserves freshness
QDF is "query deserves freshness". For certain queries, the search engines determine that up-to-date information is more relevant than other types of content, so they refresh those results more frequently. So if you notice that some of your content did not perform quite as well, it might just be because it's outdated.
So a little quick refresh can help you take advantage of the opportunity to rank better.
Ongoing updates
Last but not least, ongoing updates. SEO is not stagnant; it's continuously dynamic. Things are moving and changing, and the search engines are pushing out updates on a daily basis.
So keep an eye on, like I said, their business model, try to understand where they are headed, and try to be able to predict where they’re going. Keep on top of the updates and then adjust as you go. But yeah, so these are the three bosses of SEO, and these are all what they need.
As I mentioned, I probably missed a lot of things, but the idea isn't for this to cover everything. The idea is just to get you thinking about SEO from a very holistic perspective. You might be wondering, "This is a lot. Where do I even start?" Well, the most important thing is your business. Make sure that you're doing the right thing for your business.
Then make sure you do the right thing for your searchers and then start satisfying the search engines to get results. But yeah, so that’s all I have for you today. Leave your comments below. I would love to have a discussion with you and see what we can learn from each other as well. All right. See you next time.
Video transcription by Speechpad.com.