This article explains what “AI search optimisation” actually means for your Shopify store in 2026, and what you can do to prepare.

Search is no longer just a list of blue links. Google increasingly answers questions directly, summarises options, compares products, and pulls key details into AI-generated overviews. For the first time in a long while, there is also real competition for search traffic from platforms like ChatGPT and Perplexity. Soon, the default search in Google’s Chrome browser will generate AI answers instead of giving you a list of websites to choose from. This does not mean SEO is dead, but the type of SEO that works is changing. Visibility now depends not only on whether you rank, but on whether your site is easy for algorithms to interpret, summarise, and trust.

For Shopify stores and Shopify agencies alike, that shift has a practical implication: your website needs to provide clear, structured, dependable information, and we as an agency need to guide you towards achieving that. The sites that win are the ones that make it easy for Google and other AI systems to understand what you do, what you sell, and why you are credible.

AI search rewards clarity

A lot of SEO advice is still rooted in the idea of gaming rankings. AI-driven search does the opposite: it considers page content as a whole, in context, and skips past your carefully constructed user journey and liberally sprinkled keywords. It tends to amplify sites that are easy to parse and hard to misunderstand, with clear information and advice that it can serve up.

On Shopify, that begins with page structure. AI systems extract meaning from headings, repeated patterns, and consistent information placement. A product page with a clear structure, well-labelled content modules, and accurate data is much easier to summarise than a page where key information is buried in a long, unstructured description.

The same applies if you offer services instead of products: AI systems need to understand what they are, your process, your proof, and your specialisms quickly. Pages that are vague or overly salesy can be harder for AI systems to confidently cite because the information is not explicit.

That sentence at the top of this article, the one that explains exactly what it is about, is part of this strategy – even AI will take the shortcut of reading the first couple of lines of a page when judging whether it is worth crawling.

Structured data and schema are more important than ever

Schema has been important for years, but AI-driven visibility moves it from “desirable” to “essential”. Structured data provides explicit cues about what a page contains, making it easier for AI to understand the page and to see why the content (and the company behind it) is a good recommendation for users. For a Shopify store, product schema tells AI search what the product is, along with its price, availability, variants, and reviews. For content pages, FAQ schema can help highlight clear answers, and breadcrumb schema clarifies site structure and hierarchy.

Shopify provides some schema by default, but themes vary. In many cases, improving schema is one of the most direct ways to make a store more “AI readable”. It reduces ambiguity and makes it easier for systems to extract accurate information without guessing.
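To make that concrete, below is a minimal sketch of what product schema output from a theme snippet can look like. The snippet name, fields and filters are illustrative assumptions rather than a recommendation for your specific theme – audit what your theme already renders first, then validate the output with Google’s Rich Results Test.

```liquid
{%- comment -%}
  Illustrative only, e.g. a snippets/product-schema.liquid rendered from the
  product template. Extend with brand, image and review data where the store
  actually has it; multi-currency stores will need more care than this sketch.
{%- endcomment -%}
<script type="application/ld+json">
{
  "@context": "https://schema.org/",
  "@type": "Product",
  "name": {{ product.title | json }},
  "description": {{ product.description | strip_html | truncate: 300 | json }},
  "sku": {{ product.selected_or_first_available_variant.sku | json }},
  "url": {{ shop.url | append: product.url | json }},
  "offers": {
    "@type": "Offer",
    "priceCurrency": {{ shop.currency | json }},
    "price": {{ product.price | divided_by: 100.0 | json }},
    "availability": "{% if product.available %}https://schema.org/InStock{% else %}https://schema.org/OutOfStock{% endif %}"
  }
}
</script>
```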

However, the key here is quality over quantity. Each schema type is designed to fulfil a specific purpose, so it can’t be applied to every page in a blanket approach. The schema in use must match the page, or incorrectly marked-up content will backfire. Good schema should reflect real content and real structure, and it’s up to your agency to provide recommendations on which schema to use, and where.

Consistent content modelling beats one-off page hacking

AI systems prefer predictable patterns. That is one reason metafields and modular sections matter. If your product pages all present materials, sizing, shipping, and FAQs in a consistent way, the store becomes easier to interpret at scale. If every product page is built differently, the site becomes harder to summarise reliably.
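As a rough sketch of what that consistency can look like inside a theme, the block below renders the same labelled modules on every product page from metafields. The “specs” namespace and keys are hypothetical – you would define your own metafield definitions in the Shopify admin and match the names your theme uses.

```liquid
{%- comment -%}
  Hypothetical product-template section. Each module only renders when the
  metafield has a value, so every product page follows the same pattern.
{%- endcomment -%}
{%- assign material = product.metafields.specs.material.value -%}
{%- assign sizing = product.metafields.specs.sizing.value -%}
{%- assign shipping = product.metafields.specs.shipping.value -%}

{% if material != blank %}
  <h2>Materials</h2>
  <div>{{ material }}</div>
{% endif %}

{% if sizing != blank %}
  <h2>Sizing</h2>
  <div>{{ sizing }}</div>
{% endif %}

{% if shipping != blank %}
  <h2>Shipping and returns</h2>
  <div>{{ shipping }}</div>
{% endif %}
```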

This is not only a technical issue. It is an operational one. Merchants often struggle to maintain consistent product content when the content model behind their Shopify store does not support structure. A good Shopify build makes it easier to maintain high-quality data, and that data quality feeds visibility.

In practice, the stores best positioned for AI-driven visibility are the ones with strong content governance. Their product information is complete. Their collections have useful context. Their editorial content supports real customer questions.

Build pages around real questions and decisions

AI overviews tend to surface content that answers questions. eCommerce brands can take advantage of this by creating supporting content that genuinely helps buyers make decisions. Buying guides, comparisons, sizing advice, care advice, and “why this matters” explainers give AI systems useful material to cite.

For agencies, the same principle holds. Content that explains the difference between theme types, the trade-offs between apps and custom code, how migrations work, and what makes a Shopify build performant tends to be referenced because it answers common business questions.

This is where “topic clusters” remain powerful. A pillar page and supporting articles create a web of context that both humans and machines can follow.

Technical hygiene still matters, because AI still crawls the web

AI-driven search does not remove the fundamentals. Crawlability, indexation control, canonical handling, and site speed all still matter. In fact, speed and stability may matter more. AI systems are less likely to cite pages that feel unreliable, slow, or inconsistent.

Shopify stores often develop technical clutter over time. App scripts accumulate. Tags are duplicated. Themes get extended without restraint. Keeping the site technically clean helps not only performance, but also interpretation.

Brand proof signals build essential trust in you

One of the less well-known factors in AI summarisation is that systems tend to favour sources with recognisable authority signals. That means proof matters, and if it’s third-party verified, even more so. For Shopify stores, this proof will be found in reviews, policies, clear contact details, structured business information, and high-quality relevant and informative content. 

It is not enough to claim that you are specialised in what you sell. You need supporting content that demonstrates it, and you need signals that your business is real, credible, and experienced.

What to do now: a practical approach

The best way to prepare for AI-driven search is not to chase gimmicks; it is to do the things that make your site clear and robust. Often, you may have been aware of the shortcomings for years, but have put them on the back burner because they don’t materially affect how your site works, or how you do business.

Here’s a checklist of the actions we recommend:

  • Improve structured data
  • Tighten page structure 
  • Build consistent product content through metafields
  • Create supporting articles that answer real questions
  • Keep the site fast and stable
  • Make proof visible

If you build these changes into your structure, you will not only be better positioned for AI Overviews, you will also have a better store.

AI search optimisation conclusions

AI search optimisation for Shopify is not a separate discipline from good SEO and good UX. It is, in many ways, the logical conclusion of both. AI systems reward clarity, structure, and credibility. Shopify stores that are built with those principles in mind will be easier to crawl, easier to interpret, and easier to surface in modern search experiences.

If you want your Shopify site to remain discoverable in a world where search results increasingly summarise the web rather than simply listing it, the path is straightforward: structure your content, strengthen your technical foundations, and create the kind of information that deserves to be referenced.

Webselect can help you achieve this, so if you’d like to find out more, get in touch.

This article is written for merchants and marketers who want clarity on what matters for Shopify SEO, and for anyone considering working with a Shopify agency and wanting to judge whether they have a proper SEO method.

Shopify is often described as “SEO-friendly”, and in a general sense that is true. You get clean URLs, sensible defaults, and a platform that is stable and secure. But SEO results in 2026 are rarely earned through defaults. They come from structure, consistency, and a store that is built in a way search engines can understand easily. They also come from making deliberate choices about content, internal linking, and technical hygiene – especially now that Google’s results increasingly blend classic rankings with AI summaries and richer SERP (Search Engine Results Page) features.

A good Shopify SEO approach is not a one-off execution of a bag of tricks. It is a repeatable system that builds authority and trust factors into content, and works to build your position for the keywords that matter. What follows is the playbook we use when the goal is not “a few improvements”, but sustained visibility for commercially valuable terms. 

Start with your commercial target, not the keyword list

The first mistake we see is approach-by-spreadsheet: hundreds of keywords, but little commercial logic. In eCommerce, the best SEO starts the other way around. You begin with what you sell, how people shop for it, and what the highest-value journeys look like. We work with you to understand that, and provide strategy to position your brand for the keywords which matter. That becomes the basis for your information architecture, your category strategy, and your content plan.

On Shopify, this matters because the platform’s structure nudges sites into patterns. Collections become categories, and products become the decision pages with blogs and editorial content as the supporting layer. You want that structure to map cleanly to how customers search, otherwise you end up with messy navigation, thin category pages, and content that competes with itself.

If your goal is to rank for something broad and competitive like “luxury food”, the same principle applies: you need a page architecture that clearly signals expertise, services, and supporting evidence. That is a positioning challenge as much as a technical one.

Get the foundations right: site structure that search engines can interpret

Most Shopify SEO problems are not difficult to diagnose. 

They typically come from:

  • Unclear structure 
  • Collections that overlap
  • Tags used as pseudo-categories
  • Duplicate routes to the same product
  • Landing pages that exist for campaigns but are never integrated into the site’s internal linking system

A cleaner Shopify structure is to have a small number of strong collection hubs, supported by sub-collections or well-organised product groupings, with internal linking that makes relationships obvious. Your navigation should be built for humans first and foremost, but when it is built well, it also helps crawlers understand what the site considers important.

Early on in the process, we also look at how product types are structured. If your store has very different product categories, they may need different product templates, different content modules, and different supporting content. Shopify Online Store 2.0 makes this easier, but it has to be planned.

Fix duplicate content and canonical confusion before you write more pages

Shopify stores can generate multiple URLs leading to the same content. This does not always cause disaster, but it can muddy signals and cause ranking volatility. A common example is product pages accessed through different collection paths. Filter parameters can also create messy crawl behaviour if they are not handled properly.

A proper SEO playbook includes identifying where duplicate routes exist and ensuring canonical usage is sensible. It also includes making sure indexing behaviour is deliberate. Not every URL Shopify can generate needs to be crawled (or indexed), and identifying the ones which can be safely removed from the site’s architecture is a science.
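One quick check worth making here: Shopify exposes a canonical_url object, and most themes already print it in the main layout. A minimal sketch of the tag, which should normally live in layout/theme.liquid:

```liquid
{%- comment -%}
  Usually already present in the theme layout – the point is to confirm it is
  there and has not been removed or overridden by an app or a custom template.
{%- endcomment -%}
<link rel="canonical" href="{{ canonical_url }}">
```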

There is also a more subtle kind of duplication: internal duplication caused by content strategy. If you create multiple pages targeting the same intent, you risk keyword cannibalisation. This is particularly common with blog content that repeats category targets instead of supporting them. We can help you to consolidate and build on their performance so that nothing is lost.

Build category pages that can rank, not just list products

One of the biggest missed opportunities in Shopify SEO is the collection page. Many stores treat collections as simple product grids with a short intro line. That might be acceptable for very low-competition niches, but in most markets it is not enough. Collection pages are the pages most likely to rank for high-intent commercial searches, and they need to earn that visibility.

A strong collection page does a few things well. It explains what the category is and who it is for. It provides guidance that helps shoppers choose. It includes content modules that answer common questions. It has internal links to relevant subcategories and buying guides. And it is structured with clear headings and clean markup.

This does not mean turning collection pages into essays. It means giving the page enough substance and clarity to justify ranking, while keeping the shopping experience front and centre.

Product page SEO is about clarity

Product pages should be built for conversion, but they also need to provide structured information that search engines can interpret. In 2025, this is more important than ever. AI systems and richer search results rely on clear product data: what the product is, who it is for, what makes it different, how it fits, what it is made from, how it ships, and how it is returned.

This is where metafields can transform product SEO. Instead of stuffing everything into a single description field, structured fields allow a theme to present information consistently. That consistency is good for users. It is also good for search engines because the page becomes more predictable and semantically clear.

We typically advocate for product pages that are built around modular content sections. The hero area should present key fundamentals cleanly. Supporting modules can handle size guidance, materials, care, shipping, FAQs, and trust signals. The goal is to make it easy to understand and easy to buy. When those fundamentals are right, they’ll guide you by design towards producing optimised content.

Protect performance through technical SEO

Technical SEO on Shopify is often less about complex server configuration and more about avoiding accidental mess. Common technical priorities include making sure the site is fast, ensuring templates output sensible heading structures, keeping internal links tidy, and avoiding a bloated script environment that slows rendering.

Speed matters for SEO and conversion. So does stability. If a theme is constantly shifting or breaking due to app conflicts, the store becomes harder to crawl and harder to trust.

We also pay attention to redirects and migrations. Shopify stores frequently go through redesigns, and careless URL changes can wipe out years of earned visibility. A competent Shopify agency treats redirects as part of the project plan, not as a last-minute checklist.

Web page schema is essential for modern search visibility

Schema markup helps search engines understand content explicitly. For eCommerce, this is often the difference between plain results and rich results: product price, availability, review stars, FAQs, breadcrumbs, and more.

Shopify provides some structured data, but themes and apps vary widely in quality. We often audit schema and improve it at three levels. Product schema should be accurate and complete. Collection-level signals should be consistent. Editorial content should use appropriate article and FAQ markup when it genuinely fits. Breadcrumb schema can also help clarify site hierarchy.
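For the FAQ case, the markup is simple JSON-LD. A minimal sketch is below – the question and answer are placeholders, and the markup should only ever mirror FAQ content that genuinely appears on the page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org/",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "How long does UK delivery take?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Orders placed before 2pm are dispatched the same day and usually arrive within two to three working days."
    }
  }]
}
</script>
```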

Schema is not a magic wand, but it is a quality signal, and it is particularly relevant as AI-driven search becomes more prominent.

Content strategy: supporting pages that strengthen your commercial rankings

Blog content should not exist in isolation. In a strong Shopify SEO strategy, content is designed to support your money pages. That might mean buying guides, comparisons, care guides, sizing explainers, gift guides, and editorial pieces that build topical authority.

For an eCommerce site aiming to rank for “women’s shoes”, the same concept applies: your core collection page is supported by content that demonstrates depth. Articles about different styles, new collections, sizing, shoe care, and brands strengthen the perception of expertise. Internally linking those pages in a logical way helps search engines and users see the whole picture.

Measurement: SEO is simpler when you define success properly

In eCommerce, rankings are not the only metric. You care about qualified traffic, assisted conversions, and whether visibility is growing for terms that make business sense. We measure SEO success in a way that ties back to revenue. That means tracking category visibility, organic performance on top products, the health of content clusters, and how well the site converts organic visitors.

Shopify SEO essentials

Shopify SEO in 2025 is not about chasing loopholes. It is about building a site that is structurally clear, fast, content-rich in the right places, and disciplined about technical hygiene. When the theme and content model support that clarity, your store becomes easier to crawl, easier to interpret, and easier to rank.

If you want SEO to be a growth channel rather than a slow trickle, the playbook is straightforward: sort the structure, build pages that deserve to rank, support them with expert content, and maintain discipline as the store evolves. That is what high-performing Shopify SEO looks like.

If your site isn’t performing, get in touch with us and we can help.

It’s been a while since the last news on a Google algorithm change, and it looks like the dust has now settled from the mid-March Core algorithm update, although some of our sites have only started to level off this week.

The results have been less dramatic than the back-to-back updates in December, although some sites that had made significant and continuing gains since that update rolled out have seen the results diminish somewhat.

General consensus is that this update has affected sites that use forum-based content, which are mostly going to be forums (but not Reddit, which we’ll discuss later) or that have engaged in heavy usage of AI-based copy to rapidly scale their content strategy. The headline sector to be affected is retail, which is unwelcome news given all the other pressures currently being exerted on retailers.

What can you do if a Google Core update affects your rankings?

The official guidance from Google is always to do nothing, as the altered performance for your site will be in line with the, if we’re being honest, shifted goalposts that the update represents.

This is problematic for two reasons. The first is that businesses that have been experiencing a certain level of trade linked to higher places in the rankings will now see that decline or even stop altogether. The second is more a point of pride than anything else, but some of the sites that are now top ranked are not as good as the sites they’ve replaced, which does tend to raise questions about where the benefit to the user is.

When looking at your recovery, the most helpful thing to start with is a list of the keywords you’ve gained and lost, improved and declined on. Not all keywords are created equal, and you might find that although the numbers are lower, the traffic and intent behind those you’ve dropped mean you aren’t any worse off.

If you find that this best case scenario isn’t representative of your experience, it’s time to grab that list of keywords and work out where your targets are.

Once you know where you need to improve, you can then start putting a plan together. Have the pages that ranked for those keywords had a lot of love put into them, or have they been static for long enough that it’s time for an overhaul? We know from Google’s algorithm leak last year that they keep track of page updates over a span of time, so it’s a sensible place to start.

If the page has been receiving frequent updates (and consider if they’re too frequent, so might have negatively affected the quality), look at what you can do around the page to support it with a “pillar-based” SEO strategy, where it becomes the central repository for targeted pages that support your keyword targets. If that doesn’t help, what we’re going to discuss next might be of interest.

Take a look at Reddit

Reddit has flown under the radar for certain generations, and in a post AI-overview world that should change.

Brands are making a space for themselves by setting up accounts and discussing what they do with already engaged audiences (it’s important to declare your affiliation with the brand, but that doesn’t have to be a negative). Reddit has also already been, and with the new license it’s offering will continue to be, a training ground for Generative AI.

Google’s AI overviews are the next big thing

Google’s AI overviews are now appearing for more and more searches, and offer both a source of frustration and a lifeline to brands. Recent changes mean they could pull the information from any site, not just sites already ranking in the top ten, which is an invitation for brands to leapfrog huge industry players and end up at the top of page one.

The science behind making this happen is still in its infancy, but as we know Reddit is one of the places where the GPTs are being trained, it makes sense to establish a foothold on that platform.

Google’s becoming the snake that eats its own tail

A slightly less welcome development this month has been the inclusion of Google search results in AI overviews. This means the chances of a zero-click search have risen (this is a search where the answer is found without ever needing to land on your website), and traffic overall has fallen as a result of people not needing to leave Google to get what they want.

This will be especially unwelcome for businesses that rely on this traffic for the occasional impulse purchase or even just to place their brand in the mind of users landing on a popular blog.

Expert help with your search results

If you’d like a chat about your site’s performance, or need an accurate picture so you can start keyword planning, get in touch. There are no easy answers or fixes when it comes to organic search, but that doesn’t mean we can’t help.

On 12 December last year, Google started the rollout of its latest Core algorithm update, which finished on 18 December and was immediately followed by an update to Google’s Spam detection algorithm on 19 December. The core algorithm update caused noticeable shifts in rankings across various industries, but many website owners are reporting seismic changes from the spam update.

 

What Are Google Core Algorithm Updates?

Google’s core algorithm updates are broad changes designed to improve the relevance and quality of search results. Unlike specific updates targeting particular issues—like the Helpful Content Update or the Spam Update—core updates refine how Google assesses and ranks content overall. These changes can influence visibility in search results significantly, as it essentially moves the goalposts for judging successful content. Sometimes they move more than others, which is why it’s entirely possible that your site may have been through previous updates without being affected.

 

Who has been negatively affected by the December 2024 algorithm update?

Reports from the Core update indicate these kinds of site have been affected:

  1. E-commerce Websites: Some online retailers have reported a decline in search visibility. This may reflect Google’s increased focus on distinguishing high-quality, authoritative product pages from thin or repetitive content. Sites with poorly optimised product descriptions or lacking user reviews may be particularly vulnerable.
  2. Informational Websites: Informational content publishers, especially smaller niche sites, have faced challenges. Larger generalist platforms with diverse, authoritative content have seen gains at their expense. This shift suggests Google is prioritising sites that offer comprehensive, multi-faceted coverage over narrowly focused expertise.
  3. Affiliate and Ad-Heavy Sites: Websites relying heavily on affiliate links or ads without providing unique value to users appear to have been negatively impacted. Google’s guidelines emphasize content that serves user needs rather than solely commercial interests.

For the Spam update, we know that it was a general update rather than being aimed at a particular kind of spam, but we don’t know if the criteria have changed. There’s been no change to Google’s published list of the kinds of content and behaviour to avoid, yet many affected sites are adamant that they aren’t using spam-adjacent tactics – which, if they were, would at least have gone some way to explaining the cases where traffic has almost completely disappeared post-update.

 

Why does Google update their Core algorithm?

As with all tech, changes outside the Google ecosystem mean there is new searchable content that Google needs updates to interpret correctly. As a very top-level overview, they will be looking for:

  • Quality Over Quantity: Thin or duplicate content has become a target, with Google rewarding sites that offer depth and originality.
  • Authority and Expertise: Sites demonstrating experience, expertise, authoritativeness, and trustworthiness (E-E-A-T, Google’s own recipe for search success) continue to perform better.
  • User Experience: Pages with excessive ads, poor mobile usability, or slow load times may face penalties.

 

Google’s Spam criteria

If you want to read Google’s full spam policies, they are published in full in Google’s Search documentation. If you’d rather read my interpretation, which is admittedly much shorter, here it is:

  • Cloaking – showing a different page to search engines than the one seen by website visitors
  • Doorway abuse – pages that just exist to send people to other pages on the website
  • Expired domain abuse – buying the expired address for a legitimate website to send people to a junk sales site
  • Hacked content – having pages on your site that have been hacked and not repaired
  • Hidden links and link abuse – hiding links on a page that are just there to affect search results
  • Keyword stuffing – too many of the same keywords in your site content
  • Link spam – buying links from an online service. These are usually spammy sites so easy for Google to recognise.
  • Machine generated traffic – traffic from bots, which are programs created to pretend to be people
  • Malware – linking to malware downloads
  • Misleading functionality – telling people you’ve got a feature, but the feature doesn’t work
  • Scaled content abuse – adding loads of pages which have AI-generated content and don’t actually help your users
  • Scraping – copying content from elsewhere
  • Sneaky redirects – redirecting visitors to a different page than the one shown to search engines, or to content they weren’t expecting
  • Site reputation abuse – publishing third-party content on an established site purely to trade on that site’s reputation
  • Thin affiliation – republishing a merchant’s product content as an affiliate so it looks like you are the shop owner, without adding any value of your own
  • User-generated spam – having rubbish in reviews or forum comments (you’ll often see this promoting pharmaceuticals)

 

Can your website recover from an algorithm update?

Yes, but it might take some hard work and an undefined amount of time. Sometimes a subsequent update will undo the damage, but generally speaking if you’ve come a cropper, there’s something about the site you could be doing better, and that means identifying it, actioning improvements, and continuing to build back using the age-old technique of creating high-quality, relevant content.

Our guide to Search Engine Optimisation basics will get you started, but we’re also here to help in a hands-on way – there’s definitely a correlation between maintaining, or even improving, your search rankings and having a well-optimised site. Drop us a line if you want to discuss the options.

For years now, the future of marketing has been seen as video content. Generally speaking, it connects differently with audiences than text content, because it’s quicker to digest and a more engaging medium. It’s also versatile, because a longer video can be recycled into ads for use on social media.

The downside to that is the perceived difficulty and expense of generating video content.

How can I make video content on a budget?

First, let’s not get carried away with all this video freedom. Good video content takes planning, and if we skip that phase the end result is going to be messy.

Set out the objective of the video you want to make, and the details of how you’re going to achieve that. Do you want to sell a product, are you trying to pass on some knowledge about your company, or is it something else? These questions will help you plan it out:

  1. Is that the right format for the job?
  2. What are we making a video to achieve?
  3. Does it need to feature real people or a voiceover, or can it be done with text?
  4. Does it need any graphic design work done ahead of time?
  5. How long does it need to be?

Generally speaking, I divide my video content into three categories:

  • Video that doesn’t need a human involved
  • Video that needs part of a human involved (usually their voice)
  • Video that needs to be all people in the middle, with an intro and outro to make it look professional

Category one needs me to plan out the story of the video, but the information will be conveyed with short bits of text and images, so it doesn’t need as much prep as categories two and three.

If there’s a human being involved, though, it’s important to write a script out before you start. This will keep your message on target, and also keep you from ending up with lots of footage that you need to edit.

Recording footage on a budget

Part 1 – recording people

If your video needs a real person, this section is for you. If it doesn’t, you can skip to the next section.

Your secret weapon here is that in 2025 we all carry superb quality cameras around with us every day. The key to making the footage they record look professional is in the background, and the lighting.

All smartphones have a camera, and most are capable of 4k video – this is what you’ll need if you’re recording just one person doing a piece to camera.

If you’re recording an interview, or you want to feature multiple people, a video call might be a better option. Zoom has a feature which will now record video and audio at each end of the call, which can then be combined into a stutter-free video which you can edit in one of the tools below. Microsoft Teams is also capable of recording calls, although maybe without quite such a smooth result, it’s still pretty good.

If you’re recording a video call, encourage your presenters to wear headsets, or worst case scenario, modern headphones that offer hands-free will do. That’ll cut down on any background noise. Ask them to present in front of a tidy background – it doesn’t need to be a liminal space drifting through a white void, but it shouldn’t look like the inside of a skip, either. Video backgrounds have come on a way but still look a bit too fake to use unless there’s an emergency.

If you’re just recording a voiceover, there are a few ways to do it. Windows has Sound Recorder, which is ancient but still does the job. If you need to edit the file, Audacity is a free bit of software that will let you chop it up and rearrange it, then output as an audio file ready to use.

Editing Teams/Zoom footage

This is where the time spent planning out the script is important. With the software you’ll be using, you can make some edits, but you won’t want to have lots to make – it’s fiddly and time-consuming. If you’re going to need to make lots of edits, I would recommend you get a copy of PowerDirector, which is easy to use and comes in either subscription or one-off cost form. In the long run that will save you time and money vs trying to edit using an unsuitable platform, so it’s worth the small investment.

Part 2 – Adding the professional touch

OK, so if you need it you should now have your audio and/or footage of people ready to go. To create your video, you’ll need one of these:

Dedicated video tools:

  • Lumen5
  • Biteable
  • InVideo

Tools that work with video, but not as their specialism:

  • Adobe Express
  • Canva

All these platforms have a drag and drop video editor that lets you place either your own clips or the stock footage which the platforms contain anywhere you want in the video timeline.

You can add multiple layers, allowing you to have text appear over the video, and also to create intro and outro sections using your branding. You can also apply animation effects to the text or other elements, which can let you put together some slick intros, outros and transitions. Speaking of transitions, you can also change the way that the video moves from clip to clip – sadly, starwipe isn’t an option any of these platforms offer.

They also have a bank of royalty-free music which can be used to score your creations. Some have a dedicated feature to let you add a voiceover, but if not you can always add it in as a music track.

Of all the platforms above, Biteable is the only one to offer not just stock footage but also reusable animated sections. These can be more helpful than stock when you need to present information, although you can still put decent videos together with anything on the list above – it just takes a bit more ingenuity.

Pro tips for new video editors

Don’t have lots of dialogue if there’s text to read on the screen, and vice versa – your audience won’t be able to concentrate on both.

If there is text on screen, check the timings with someone else before processing your video. Because you know what the text says, it’s easy to underestimate how long it will take someone to read for the first time. Equally, you don’t want the text on screen for so long that your viewers get bored.

Don’t get too carried away with transitions – keep it simple, like a slide or a fade, instead of adding more visual distractions.

The information above is all you need to get started, but if you need more help, there are lots of dedicated guides on YouTube – now get out there, and start putting your own video content together!

Note for anyone that buys PowerDirector

So, the irony here is that PowerDirector is more capable of editing than any of the platforms discussed above, so while you can use the method I’ve suggested, you may find it easier to compile the clips you want from a platform, then export them and cut the whole thing together in PowerDirector. It might be worth trying both ways so you can find the one that works best for you.

If the current understanding of how Google ranks websites is right, every site is like a leaky bucket.

It holds your website authority, but when you link to other sites, each link creates a hole through which a little bit of your site authority leaks into their bucket, and so on. Your bucket is simultaneously filling with authority from other sites, but what can you do if, to take this tortured analogy all the way, the water is polluted?

What are toxic backlinks?

We know from this year’s Google algorithm leak that they assign every site a score based on aggregate factors including technical health, user behaviour and backlink quality. While this was pretty much the way the industry had assumed it worked for years, the confirmation was still nice.

A link from a well-maintained and updated site would be a positive, endorsing your site’s status as an authoritative destination for information about whatever it is you do. On the flipside of this, what if you’ve got links from sites that aren’t being maintained? Maybe they don’t have a security certificate (giving them the https address, a must for Google), or even worse, they might be previously legitimate websites that have since been hacked.

They don’t represent a positive trust factor for your site; in fact, it’s the opposite. You might also have no idea they’re there, unless you invest in a dedicated tool like AHRefs or SEMRush.

Why do toxic backlinks happen?

This is where we don’t have all the answers, or at least not entirely. There are old sites, and there are sites that linked to you before they were hacked, but there are also sites that get hacked and then add a link to your completely above board business.

There are a few suggested reasons for this, but in the end we can only speculate. This morning I checked on the website of a client and found that they have three new toxic backlinks from different sites, all three of which share the exact same text in the link and on the page itself – at least from what I could tell, because all three sites were flagged as dangerous by Google, which refused to let me visit them.

The fact that the content is duplicated means this is most likely being done by bots, not humans. They might be building the quality profile of a site which criminals want to prove is genuine so they can use it to commit ad fraud, or they might be bad actors working for an enemy state which is trying to do something more insidious.

Scary stuff. But can you fix it?

How to get rid of toxic backlinks

You’ve invested in an SEO tool, and found you’ve got a ton of links you don’t want, which are probably bringing your site authority down. You know what their URLs are and can see the meta description for each page via your SEO tool, which will give you an idea of whether it’s legit or not. You’ve now got two options.

  1. You can contact the site owner and ask them to remove the link. This bit of advice is still being given out so I’m mostly including it for completeness, and so we can have a bit of a laugh about how you do that when your antivirus software won’t let you visit the site.
  2. Hopefully you’ve followed the advice in our previous article and set up Google Search Console. If you haven’t, you’ll need to do that first, then come back.

Right, done? Now you can use Google’s Disavow tool, which is here: https://search.google.com/search-console/disavow-links

How the Disavow tool works

You’ll need a plain text file which lists all the backlinks you want to disavow (basically, telling Google not to count the link to you when it’s assessing your site). Most of the SEO tools will do this for you, and SEMRush has a particularly helpful setup. You just upload the list via the link above, and hey presto! Google will, sooner or later, accept the list and apply it when indexing your site. The time quoted on submission is quite long, but I’ve found it is usually much quicker.
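For reference, the disavow file itself is just a plain text list with one entry per line – the domains below are made-up examples:

```text
# Comments start with a hash
# Disavow every link from a domain
domain:spammy-example-directory.com
# Or disavow a single page
https://hacked-example-blog.net/strange-outbound-links.html
```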

Does removing toxic backlinks have an effect?

Toxic backlink theory has taken a bit of a hit because Google experts have repeatedly said that they don’t affect your site, but it’s good to remember that they are the same experts who said Google doesn’t apply an aggregate quality score to your site.

We’ve had success for a luxury menswear client who had picked up a slew of bad backlinks following a site migration. Prior to submitting an updated disavow list for them, some of their major keywords were fluctuating in position, but we’ve seen that settle down, allowing us to focus on achieving organic growth.

We can’t be certain of the impact, because in Search Optimisation we never can, but in legal terms, the balance of probabilities says it had a positive effect.

Backlink disavowal basics

The most important thing is to be sure about the sites you’re disavowing. Look at the clues:

  • Does the meta description consist of random words?
  • Does it seem like the site should be linking to you?
  • Is the site domain from a country that you have no link with, or are suspicious of?

Basically, if you sell luxury lamps in Luton, would you expect to have a link from the website of an Australian plumber?

If the answer is “no”, then disavow that link. If the answer is “yes”, drop me a line, because that sounds like a story worth hearing.

Managing your own website, whether it’s eCommerce or not, is a tough job. If you’re doing it on a tight budget, and that probably describes most of the website-owning population in 2024, it’s even tougher. Successful websites thrive on continuous improvement, and that’s only possible if you have the data you need to make decisions.

Luckily there are still some free options to help you start building that data picture, and in this article we’ll look at ten of the best, plus a bonus one that’s very nearly free. We don’t have ties to any of these products, but if they want to send over free t-shirts as a result of this article, it’s a size medium please.

  1. Google Analytics
  2. Search Console
  3. Pagespeed Insights
  4. Answer the Public
  5. SEMRush
  6. AHRefs
  7. Google Trends
  8. Looker Studio
  9. Hubspot
  10. Hotjar

Almost free – Keywords Everywhere ($27 a year)

Google Analytics

The equivalent of a National Insurance number – everybody with a website needs this. Like most free tools there’s a steep learning curve, which became a bit more like an unclimbable wall with the release of Google Analytics 4, but there are lots of guides out there which can help you get to grips with what it’s telling you, including free courses from Google itself. We can also help you to set it up and understand the data it’s reporting, if you prefer the human touch.

There are customisable goals (now called Key Events) which you can implement to report on the parts of your site that really matter, like whether people are taking actions like submitting contact forms. You can then use this to assess whether your marketing activity is working.

Search Console

It’s easy to mistake Search Console, which is another Google tool, for an older-looking version of Google Analytics with fewer features, but it’s there to do an entirely different job. Actually, it’s there to do three entirely different jobs.

Search Console’s able to tell you the queries people used to find your website, which is actually telling you the keywords that your site performs best on. It can tell you whereabouts in the search results you are, which you can use to guide your site optimisation and other advertising. If you’re not performing well for a keyword you need to rank for, some of the other tools in this article can help you find out how to improve.

It can also tell you about the health of your site – are all the right pages in Google’s Search Index? (if you run an E-commerce site, you might not want every variation of a product to be in there, for example – just the main product page). It will also warn you if pages become unavailable, which can be helpful if you’ve got a site outage you don’t know about.

Lastly, it offers basic reporting on the quality of your website experience. Google’s assessment of your site looks at not just the content, but the overall performance. You can get a more detailed picture from Pagespeed Insights, another tool we’ll discuss later, but for a quick assessment, Search Console is an option.

There are also a few smaller services that are easy to overlook. Most websites will hopefully never have to suffer the effects of a Google Manual Action (a penalty from Google for doing something they see as trying to cheat their systems), but if you do, it can be appealed from within Search Console.

There’s also a link section that looks at the backlinks coming into your site – want to know which are your top linked pages, and which sites link to you the most? You can find out here.

Pagespeed Insights

If you need to know technical information about the performance of your pages, this is where to find out. There are quite a few different metrics you’ll be judged by, including the speed of your site, how much the page layout shifts around when it’s loading, and how quickly it becomes interactive.

The downside is that while it gives recommendations, most of them can only be solved with the help of an agency that has experienced developers. The upside is that we are one of those agencies, so if your site isn’t at its best, we can help you fix it.

Answer the Public

Knowing which questions your website has to answer to rank well on Google can be huge in terms of performance. Answer the Public is a tool which has been through a few different iterations (and owners), but the basic functionality remains pretty much the same. You can put in your search term and it will find you related searches on the same subject, including the questions users are asking. Creating content around the suggestions it gives will increase your authority on the subject, and your chances of the users you want landing on your site.

SEMRush

The paid version of SEMRush offers a wide selection of tools to help with SEO (Search Engine Optimisation), but there are still some you can use on the free plan to build up your site. There’s even limited keyword tracking, which means you can find out how well your site is ranking for important search terms and see whether it moves up or down the rankings via daily update emails.

AHRefs

Similar to SEMRush but taking a different approach to the free account is AHRefs, which will let you do limited analysis on sites which it validates through your Search Console account. Both SEMRush and AHRefs use their own proprietary metrics to calculate your site’s strength, so if you try both, prepare to see some variation in the reporting. AHRefs will send you some very useful tutorial videos, even if you are on the free plan, so may actually be the best one to try first.

Google Trends

Google Trends is similar to Answer the Public in that it will show you search trends and related searches, but the information it produces is a lot more limited. On the plus side, it’s fully free to use, so you can always use it as a fallback when you run out of free Answer the Public searches for the month.

Looker Studio

This one fills more of a niche role. If you’ve got people you need to report to about your site’s progress, you can connect Google products like Analytics and Ads to Looker Studio and build stylish reports which can be filtered to show performance over custom periods. If you’ve got the patience, you can also add statistics from any other platform that will output them as a spreadsheet, by setting up a Google Sheet to hold the data and adding it as a data source for your Looker Studio report.

Hubspot

If you’re just starting out then you might not have a solution set up to capture the data about the work that arrives via your website. The free version of Hubspot offers forms that can be embedded into your site, which can be set to email to the inbox of your choice. More importantly, you can keep the data inside Hubspot and use the free features to track the progress all the way from a lead to a customer. If the time comes when you realise you need more features, there’s also a competitively-priced Starter tier that will add a ton of functionality.

Hotjar

If you’re seeing lots of visitors but no enquiries, or just really want to know what your visitors do when they’re on the site, Hotjar is the answer to your problem. It can show you how far down your pages people scroll, where they like to click the most, and lots more. It’s also just been merged with two other firms that offer similar services, so now you get even more of an allowance with the free tier.

Keywords Everywhere

This tool used to have a free version, so I know how useful it can be. Unfortunately you now have to pay for even the lowest tier, but the price is $27 for an entire year. It’s a plugin for Google’s Chrome browser that offers some really helpful features. It works a bit like Answer the Public, but instead of a dedicated site, if you search for a topic on Google, you’ll also see related searches shown to the right of your search results. It can also assess website pages for keyword density, which is less useful now that AI is making a lot of the calls when ranking a web page, but there was a time when seeing that your biggest rival had mentioned “sandwich toasters” 35 times on their sandwich toaster page which was outranking yours was really helpful, because you could then mention it 37 times (once extra for insurance) and hope to outrank them.

What to do if the tools tell you to do complex work

Whether they’re free or paid, there’s only so far that website tools can get you. If you’re seeing results you can’t explain, or your site is failing to grow no matter what you do, we’re here to help. Give us  a call or drop us a line and let’s chat about the options.

Do you know how well your eCommerce site ranks on Google? Ever wondered what your competition are doing to outrank you?

In this article I’ll look at why you should improve your organic search performance, and some tricks that can help you do it.

Why worry about organic search performance?

The truth is that for most sites, strong search performance and a high-quality user experience are the same thing. Your users want to land on a page that’s easy to use, well-written and helps them to do whatever they want to do. Google’s aim is to send them to the website best able to give them what they want. Your aim is to be the site that Google prefers to send those users to, so they can buy your product.

How Google assesses your website

This breaks down to four key points:

  • Your site’s technical performance
  • How your users behave
  • Correct use of structure and high-quality backlinks
  • Well-written content

How to look at your website’s technical performance (for free)

Getting insight into your website’s performance often comes at a premium, so the good news here is that Google offers a free way to check on technical SEO essentials like page loading speed and Cumulative Layout Shift (CLS). If you’ve ever noticed the text that’s in front of you being pushed down the page as a header image loads, that’s an example of CLS.

The bad news is that fixing the issues it flags usually requires having a good developer on tap. You probably know what’s coming next – we can help with that.

Here’s the link to the Google PageSpeed Insights tool: https://pagespeed.web.dev/

Finding out how your users use your website

There are multiple layers to this, and again, you can get some of the information you need for free, provided you have Google Analytics set up on your site. If you don’t already have it, put it at the top of your to-do list – it offers essential insight, and the only cost would be if you need a developer to implement it. Google offers lots of guidance on how to do it, although as the tool has grown in complexity it has become a lot more technically demanding.

Google Analytics offers tons of information about how your users are behaving on your site, and can help you understand how long they stay on your site, and how many pages they look at while they’re there. It also offers some basic tracking of scroll depth, which is how far they move down the page – if they never get past the bit they land on, don’t visit other pages and leave immediately, that’s all going to be used against your site when Google decides how it should rank.

How should you structure a webpage?

While there’s not a precise format to building a good web page, there are definitely some best practices to follow. Some of them might not seem linked to SEO, but it’s really all self-supporting – do a great job of selling your page and you’ll get more visitors as a result, which then improves the page’s rank.

Firstly, use the Meta Description to sell your page. The Meta Description is the little precis that appears beneath your page name in Google Search results.

All websites will allow you to write your own for your pages, although the method will be different for each.

Next, make use of the different header options. Most sites will offer headings 1–6 as standard (usually referred to as H#, e.g. H1). These offer two benefits. They tell Google’s search robots how important the different headlines are when deciding how to rank your page. H1 means it’s the most important header on the page, and all other headers should be considered less important, but still more important than the copy that makes up the rest of the page.

You don’t have to use all of them, and the ones you do use don’t have to run in strict order – just remember that the lowest number (H1) is the most important.

They also act as a visual cue for your visitors – the biggest headline is the most important, but you’re drawing attention to all the important topics in your article.
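Pulling the meta description and heading advice together, a simplified page skeleton might look something like the sketch below (the store and copy are obviously placeholders):

```html
<head>
  <title>Sandwich toasters | Example Store</title>
  <!-- The meta description is the precis shown beneath your page name in search results -->
  <meta name="description" content="Our range of sandwich toasters, from budget models to deep-fill machines, with free UK delivery.">
</head>
<body>
  <!-- One H1 describing the most important topic on the page -->
  <h1>Sandwich toasters</h1>
  <p>Intro copy...</p>

  <!-- H2s for the supporting topics, with H3s beneath them if needed -->
  <h2>How to choose a sandwich toaster</h2>
  <p>Guidance copy...</p>

  <h2>Deep-fill vs standard plates</h2>
  <p>Comparison copy...</p>
</body>
```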

Why do you need backlinks?

The rule for backlinks is that links from higher-profile (reputable) sites carry more weight – the fact that they’ve linked to you is seen as a kind of endorsement by Google, and as a result you get improved ranking for your own pages. This works in reverse, too – lots of links from low-quality sites will make your site’s performance worse.

For most sites, it’s a balancing act. There are steps you can take to stop the bad links from being counted, but building your collection of good links can be a challenge. Have a think about any connections you might have that can help, which might be clients, industry connections or accrediting bodies.

The secret to well-written content

This one is much tougher to quantify, but sits across all the other factors. A well-written site will be more appealing to customers and Google’s search robots. It also increases your chances of taking one of the search features that will give your organic search performance a boost by featuring some of your content at the top of the page.

Copywriting is definitely more art than science, so if it’s not your forte then it might be worth looking at outsourcing it. There are also tools that will help with the basics, like Grammarly – it won’t write the copy for you, but it will help make it more readable.

The weapon of last resort here is of course GPT. It has catapulted from being virtually unheard of to being the subject of at least 50% of the articles in my news feed at any one time, and used the right way, GPT can be incredibly helpful with content creation. The vital thing is to remember that it’s only part of the puzzle – it’s not taken long for people to recognise the distinctive style of content written by GPT, so it should only be part of the process. Use it for research and content ideas, but not to produce anything that’s used without serious finessing from you or your team.

Ask for help if you need it

It’s easy to fall into the trap of trying to do everything yourself, but the fact is, agencies like ours exist because this work is difficult, technically finicky and really intensive. If you’re hitting brick walls everywhere you turn, give us a ring or drop us an email. We can quote for any or all of the work you’re struggling with, and in the long term finding a quicker resolution could well end up saving you money.

If your website isn’t delivering the results you were hoping for or expecting, it’s probably time to look at your search optimisation, sometimes also called your organic search performance (so called because it happens “naturally”, as opposed to paid search results that have a direct cost). In this Blog, we’ll look at what search engine optimisation is, what it does, and the foundational approaches to carrying it out.

What is Search Engine Optimisation (SEO)?

To really understand what SEO is, it’s helpful to know a little bit about how search engines worked before Google, and how they (Google) work now.

Back in the old days, when dinosaurs roamed the web, search engines were dumb. They didn’t know anything about you other than the words you were typing into your search bar, and they used a basic matching system to deliver you search results linked to the words. If you looked for “computer mouse”, you’d get some results about computer mice, some results about mice that eat cheese, and some results about computers but no mention of mice. You could find what you wanted with a bit of work, but the internet was a smaller place generally so there wasn’t a ton of results to wade through for most topics, and it was possible to get your page to the top of a search engine results page (SERP) using tactics as ingenious as including the word you were targeting as many times as possible in the page.

That all changed when Google came along. Google's search results are affected by so many different factors that it's not practical to list them all here, but here is a very rough explanation covering some of them:

  • Google’s search robots (bots) analyse (or crawl) every single page on the internet that they can reach
  • The data collected by Google’s bots is used to assign every website a “quality” score based on Google’s own criteria
  • The technical standard of each website is assessed, including how long a page takes to load
  • The bots will also attempt to understand what the page is about in order to decide where it should appear in your search results
  • Google holds data on everyone who has a Google account (in 2023 there were 1.8 billion users of Gmail, Google's email service, worldwide)
  • By combining their data on the page and their data on you and your behaviour, Google will then show you the search result it thinks best matches what you were looking for

As I say, this isn’t an exact explanation, but if we apply it to the same search as above, you can see the difference in the results.

If you look for "computer mouse", Google knows that kind of mouse isn't related to the animal, because it understands how the words work together to create context. It also knows that when you search for those words you're looking for the answer to a question, and it wants to guess what that question is and answer it in the most efficient way it can. Your results might include a map showing the nearest places to your location that sell computer mice, especially if you're searching on your mobile phone. You might be preferentially served a site selling computer equipment that Google knows you've previously visited. You'll almost certainly get results that are adverts for places near you, or online, that sell computer equipment. Those are a topic for another day; we're interested in the results that appear below them, which could include local search results and features that highlight content from particular sites answering the question Google thinks you're asking.

And that’s where SEO comes in. It’s the art of structuring your website, including everything that’s written on it, to have Google understand what it is you do and how you can help its users, so that it can then recommend you to them ahead of your competitors.

How do we know what Google is looking for when ranking a webpage?

Because Google has done such a good job of keeping the algorithm that powers their search a secret, traditionally SEO has been based on observations of the way that pages perform.

That all changed in 2024 when, for the first time, internal Google documents were leaked that discussed some of the criteria used to decide the order of their search results.

That’s how we know that every site is given a quality score by Google, which was always suspected by industry experts, but never confirmed. We also know some of the other things that Google’s bots prefer to find when crawling a page.

Google's stated preference is for a site that demonstrates the qualities of Experience, Expertise, Authoritativeness and Trustworthiness (which Google calls E-E-A-T).

Experience means they want you to show that you've already done the thing you're telling people you can do. Expertise is showing that you can do it well and have good subject knowledge. Authoritativeness is ideally demonstrated by good-quality links to your site from other sites in the same industry, or from high-profile news sites like the BBC or The Guardian. Trustworthiness is demonstrated by reviews, whether on Google itself, on a third-party review site like Trustpilot, or hosted on your own site. On a more fundamental level, it also means your site should have secure hosting that protects your customers' data.

So, what should you aim for?

  • Informative, well-written content. This covers every letter of the acronym: it demonstrates why your goods or services are better than the competition (E-E), the better your content the more sites will want to link to it (A), and if you host customer reviews on your own site, you'll want to work with those customers to make sure the reviews are compelling (T).
  • Content that’s been structured in a way that makes it easy for both people and Google’s search bots to read and understand it. Luckily, these two aims aren’t mutually exclusive.
  • High-quality backlinks. Not much more to add here beyond what’s discussed above, but we’ll discuss the impact of low-quality backlinks later.

Creating quality content

One of the toughest parts of working on search optimisation is that it needs a combination of analytic and creative skills to address poor performance. It’s really easy to say that the solution is quality content, but what that looks like and how you produce it will be unique to everyone. To make it slightly easier, you can use the PARTY acronym:

Plan – Plan your content, not just by laying out the structure of your article, but by using tools like Answerthepublic, Google Trends, and the search suggestions in the Google search bar to look at what people are searching for, so you can answer their questions as part of your writing. It’s also helpful to produce a content plan which covers a longer period. This can be especially useful if you’re able to share the writing responsibilities, as it makes sure you’re all aiming at the same target.

Apply – One of the least sexy bits of search optimisation (and it's not the sexiest job at the best of times) is applying the fundamentals: page titles, headers and meta descriptions. These are covered in more detail below.

Research – Whether you're a user or a search bot, there's no worse kind of page to land on than one that's thin on content. If you're writing about something that isn't one of your specialist subjects, you need to do enough research to be informative and, if possible, find something new to add to the conversation.

Tell – Both the public and Google love lived experience. If you've got personal anecdotes that relate to your topic, find a way to work them in. If your house style doesn't require you to be formal (and for most retail businesses it won't), adding some personality is a great way to make a connection with your reader.

Yield – Maybe this one should have been first, but no one is going to remember an acronym that spells out Ypart. What’s your article intended to do for your site? Do you want to promote a range of products, or demonstrate expertise in a particular area? This gives you the topic for the article, and that lets you know what to aim for with everything else you do.

Using the proper structure

This should go hand-in-hand with creating quality content, so much so that I’ve included it in the PARTY acronym. The key elements are:

Page title – the first thing people and bots see, and the keystone your search performance is built on. It can often be a good idea to make this a question that you answer in the article, to fit with the questions users are trying to answer via Google.

Headers – structure your paragraphs with compelling headers. Don't be afraid to use questions here too, if they're relevant to the topic. Headers use a simple bit of code (H1, H2, H3, etc.) to tell Google the order of importance attached to your headings. Browsers also display them larger than standard paragraph text by default, which makes the page easier for your customers to scan.

Meta description – the precis that Google shows of your page's topic. This is often a missed opportunity, so don't waste the chance to write something that makes your potential visitors want to learn more. Make the best use of the character limit (although the Google leak suggested this isn't a major ranking factor, it's still good practice for the users who'll only see the part that fits). There's a short sketch below of how these three elements fit together.
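To make those three elements concrete, here's a minimal sketch of how they might sit in a page's HTML. The topic and wording are purely illustrative, and in practice your platform or theme will usually generate these tags from the title and description fields you fill in, but the structure is the same:

    <head>
      <!-- Page title: shown in the browser tab and as the clickable headline in search results -->
      <title>What is a vertical mouse, and do you need one?</title>
      <!-- Meta description: the precis Google can show beneath your headline -->
      <meta name="description" content="A plain-English guide to vertical mice: what they are, who they help, and how to choose one.">
    </head>
    <body>
      <!-- One H1 per page: the main heading, matching the topic promised by the title -->
      <h1>What is a vertical mouse, and do you need one?</h1>
      <!-- H2s break the article into sections, with H3s nested underneath where needed -->
      <h2>Who benefits from a vertical mouse?</h2>
      <h2>How to choose the right size</h2>
    </body>

The important thing is the hierarchy: one clear title and H1, a description written for humans, and headers that descend in order without skipping levels.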

Build your backlinks

Strengthening your site’s search optimisation can sometimes be a case of who you know, rather than what you know, but with a bit of ingenuity you can create connections that will increase your search ranking and hopefully bring in a few more visitors directly.

High-profile sites (news sites like the BBC, expert sites in your field of work, and some educational sites) all have good-quality authority that will flow through links into your site. Have a think about any connections you might have that could be interested in linking to you. If you don't have any, that's fine too, but it might take a bit more legwork to get a decent number of links sorted out. If you've got something newsworthy, always include a link to your site in any press release you send out. It often won't survive to the final published article, but if it does you'll be getting even more use out of the story.

A quick word on toxic backlinks

So-called "toxic" backlinks are ones that have the opposite effect on your site's ranking. They might be left over from an attempt to boost a site's profile in the early days of Google, when the number of links mattered more than their quality, or they could have accumulated for no obvious reason. Either way, if you aren't in a position to have a link removed from the other site (and many site owners won't respond to contact emails), there is a process to disavow it using a tool in Google Search Console, which can go a little way towards alleviating the problem.
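For reference, the disavow tool accepts a plain text file listing the links you want Google to ignore, either as individual URLs or as whole domains, with comment lines starting with a #. A minimal illustrative example (the domains below are made up) looks something like this:

    # Spammy directory that ignored our removal requests
    domain:cheap-links-example.com
    # A single page, rather than the whole site
    https://another-example.net/blog/suspicious-post.html

Google's own guidance is that most sites never need to use it, so treat the disavow file as a last resort rather than routine housekeeping.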

Conclusion

This has been a very high-level look at the basics of Search Engine Optimisation. It can be intimidating to tackle without the right tools to measure success and suggest improvements, which is why we’re always ready to help if you need us. We’ve helped companies from a wide variety of industries to improve their site’s visibility, and would love to help you do the same.

Get in touch