
It’s Time To Get Back to Basics

Digital marketing, like almost any other industry, undergoes cycles of change, and we're often overwhelmed with information (from so-called gurus) about the latest trends that will supercharge our websites and bring in loads of new customers!
Just type "content marketing" into Google and you'll be inundated with website after website extolling the virtues of a content-driven strategy across all channels. You'll probably even be presented with case studies describing how 'magnificent web agency X' used mobile to flip the fortunes of their struggling clients.
Now, don't get me wrong, I'm not for a minute suggesting that content is not important, or that mobile is overrated. At Devstars we've been using all of this and more to build great websites for years!
What we DO believe is that the basics need to be in place first. After all, "it's not the beauty of the construction you should look at; it's the construction of the foundation that will stand the test of time".
So, if your web design agency is not implementing the basics from day one, chances are you’ll be missing out on new customers via search engine traffic and you’ll begin to lose ground on your competitors.
Designing and building websites that are search engine compatible is what we do best, and here are just a few of the basics that we consider in each new build.

Crawler access

If you've got a website with many pages, or if you regularly add new content to your site, search engine crawlers may have a hard time crawling the pages and understanding which ones to prioritise for indexing.
And obviously you want prospective users to find relevant pages when they search for you, because if they land on pages that have nothing to do with their search, they will go elsewhere.
It's like going to your favourite department store and wandering around aimlessly because you can't find what you're looking for. You'll probably only walk around for a limited amount of time before you get fed up and go elsewhere.
And that's what happens when people can't find what they're looking for on your website.
There are probably a number of pages that you don't want the search engines to index, for example private areas of your site such as application forms and personal data. Thankfully, you can block specific areas of your site by marking them as "disallowed" in your site's robots.txt file:

User-agent: *
Disallow: /cgi-bin/
Disallow: /folder
Disallow: /private.html

It is also possible to block crawlers by name, so if you know that a specific 'bot' is visiting your website and you want to prevent its access, you can use the following (replacing 'BlockBot' with the name of the bot you're attempting to block):

User-agent: BlockBot
Disallow: /
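To see how a well-behaved crawler actually reads rules like these, here is a short sketch using Python's standard urllib.robotparser module. The rules are inlined for illustration; a real crawler would fetch them from your site's /robots.txt, and the example.com URLs are placeholders.

```python
# Sketch: how a well-behaved crawler interprets robots.txt rules like
# the ones above, using Python's standard urllib.robotparser.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /cgi-bin/
Disallow: /folder
Disallow: /private.html

User-agent: BlockBot
Disallow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Ordinary crawlers may fetch public pages but not the disallowed ones.
print(parser.can_fetch("Googlebot", "https://example.com/about.html"))   # True
print(parser.can_fetch("Googlebot", "https://example.com/private.html")) # False

# BlockBot is shut out of the entire site.
print(parser.can_fetch("BlockBot", "https://example.com/about.html"))    # False
```

Note that Disallow rules rely on crawlers choosing to obey them; they are a convention, not access control, which is why truly private data should also be protected server-side.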

But take care not to block ALL crawlers that visit your site, as it will become virtually inaccessible and traffic will drop. Having a correctly formatted sitemap in place will greatly improve Google's crawler access to your web pages, so check that this is in place and fully functional.
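If you're not sure what a sitemap looks like, it is simply an XML file, usually at your site's root, listing the pages you want crawled. A minimal sketch, with placeholder domain and paths:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-01-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
  </url>
</urlset>
```

You can also point crawlers at it directly from your robots.txt file by adding a line such as `Sitemap: https://www.example.com/sitemap.xml`.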


And while we're on the subject of sitemaps, have you uploaded your sitemap to Google Search Console? If not, log in to Google Search Console and:

  • Go to 'Crawl'
  • Then 'Sitemaps' on the left-hand side
  • Click 'Add/Test Sitemap'

In doing so, Google will test the sitemap for errors and submit it for indexing (if all is well). You can also check that individual pages are being indexed correctly using the 'Fetch as Google' function. Here are the steps you need to take:

  • Log in to Google Search Console
  • Click 'Crawl'
  • Select 'Fetch as Google'
  • Enter the URL path of the page you want to test
  • Click 'Fetch'
  • Check the status: it should have a green tick and say 'Complete'
  • Click 'Request Indexing' if available

Don't underestimate the importance of taking these steps, not only for your own peace of mind but also for practical purposes, i.e. to make sure that the crucial pages of your website are indexed and accessible.

Site structure

Having a practical, user-friendly website is at the heart of every Devstars design project.
We understand that mobile is an important ingredient of the user experience. Having said that, a solid understanding of search engines and search engine optimisation, and of navigation and site links, is a prerequisite for any web design agency.
You'd be forgiven for thinking that every design agency would take these factors into account when starting each web design project, but that's just not the case, perhaps because building a good site structure and putting content in a logical format takes time and planning.
Not every web designer has the experience and capability to follow through on these core ideals, which is potentially damaging to a website owner's bottom line.
Why? Because if search engines aren't able to find relevant pages, then it's likely that website users will also struggle to find what they're looking for, and we all know what that means.

Meta Tags: Title and meta descriptions

Meta tags are still important, basic factors that help both search engines and potential customers to navigate and find your site. Ask your web designer what they are doing to ensure that the basics are in place ready for your site launch. If the room is filled with an uncomfortable silence, think twice about using them!
Title tags in particular appear in Google search engine results pages, so they can indicate to prospective users what the content is all about.
It stands to reason that whoever is writing your website's titles should know as much as possible about what the user is searching for. A well-written title tag will also enable crawlers to understand what each individual page is about, so if you pay attention to optimising your title tags, you are likely to yield more relevant traffic via the search engine results pages. Consider these points when writing your title tags:

  • Length: title tags should be a maximum of 70 characters long, including spaces
  • Keyword placement: important keywords need to be present
  • Rewrite duplicate title tags: every page's title must be written differently
  • Make it relevant: each title must describe the content on its page
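Putting those points together, a title tag is just a single line in the page's head section; the shop and wording below are purely illustrative:

```html
<head>
  <title>Handmade Leather Bags | Example Store London</title>
</head>
```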

Meta descriptions, although not Google ranking factors, are important because they appear in Google search engine results pages (SERPs). That's why it's important to write descriptions that are related to what users are searching for. What's more, the particular keyword relating to the product or service that the user wants will appear in bold in the blurb shown in the search engine results pages.
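A meta description sits alongside the title tag in the page's head section; the copy here is a made-up illustration to show the format:

```html
<meta name="description" content="Browse our range of handmade leather bags, crafted in London. Free UK delivery on all orders.">
```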

So, the message here is: beware of false web designer prophets, and ask all the right questions before engaging a new web design agency on your next project.