Technical Insight - What is SEO?

Introduction
As creatives we have to advertise our skills, and most of us have a website. We want it to attract visitors to look at our work, buy stuff, play stuff, listen to stuff and do all of the other things people do when they arrive on a page. There are millions of websites on the Internet and, as we all know, we find them using a search engine, which lists the sites matching the criteria we enter; from that list we pick a particular site and go visit. In this article we take a look at how search engines discover and analyse websites, and see how SEO increases the chances of a visit from people browsing the web.

The Search Engine
Search engines (or SEs) like Google and Bing operate huge repositories of data about most of the websites on the net, which we users access whenever we search for information. The search engines offer us their minimal front-end pages where we type in a few words describing what we're looking for. This triggers an instantaneous dive into those data repositories to find the most appropriate websites, and surprisingly quickly the results are delivered back to our screen, where we can review the list of thousands of results and pick out the sites that appeal to us most.

As there are thousands of results in these lists, being on the first page always gives a far better chance of a visit than having our pages buried deep among the myriad lower results. It's well documented that appearing on that first page is the most desirable place to be; even appearing on the second page greatly reduces the chance of a visit. So how are these lists of results compiled?

Search Engine Ranking
As you've probably already gathered, ranking is the most important part of those search engine results. If a website appears in the first 10 to 20 results on the first page, or better still in the top half of that page, visitors are far more likely to click on the link and visit. The order of the ranking is carefully calculated using extensive computations in what is commonly called an algorithm.

The algorithm uses a range of data from the repository to determine where a site should appear, placing the sites it considers most appropriate at the start of the list. Of course, exactly how the algorithm works is kept secret; wouldn't we all like to know just what rockets a site to the top of the list? We can only generalise about how it works, but we have touched on perhaps the most important element: how 'appropriate' a site is to the potential visitor. Another way to look at this is how useful it's deemed to be, because the SEs want the visitor to get the most out of the sites they suggest.

In the instant it takes the SE to identify and list the best sites for what we're looking for, the algorithm has decided which are the most useful and placed what it thinks is the best result in the hallowed number one spot. (For the purposes of this article we're ignoring paid-for results, where a site pays to appear as an ad at the top of the page; here we're looking only at what are called organic results, found because of their natural ranking.) What, then, is in the data used by the algorithm to decide the rank?

Finding The Algorithm Data
Before a site can appear in any SE results it must have been visited by what are known as 'robots'. These are automated processes sent out onto the Internet by the SEs to trawl through the content, design and structure of a website and extract information about it, which is then compiled and used for calculating its rank. The quality of this extracted data forms just one part of the ranking criteria.

The robot will look at the structure of the website code, any hidden information included in the 'metadata' (see the example below) and the information on the page the user can see. Metadata (such as the title tag and the meta tags) is text added at specific points in the HTML code of the page which describes its content. In the first years of the web metadata was a very important ranking component, but as web designers learnt about this and tried to fool the SEs with its content, the SEs have placed less importance on what they find there. Even so, I will always place useful data at these points as it may still have value to the rank. Indeed, some of this is used by the SEs to form part of the displayed result in their search listings, so it's important to pay close attention to it. After all, your potential visitor will see it and could well make a decision to visit or overlook the site based on what they read there.
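As a rough sketch of what that hidden metadata looks like in practice (the title and description text here are purely illustrative), the head section of a page might contain something like this:

  <head>
    <!-- The title tag: shown in the browser tab and usually as the clickable headline in a search result -->
    <title>Landscape Photography Tips and Portfolio</title>
    <!-- A meta tag describing the page: SEs often use this as the snippet of text shown under the result -->
    <meta name="description" content="Practical tips on aperture, shutter speed and composition, plus a portfolio of landscape photographs.">
  </head>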

The information presented on the page to the visitor is of great importance, so it's examined in some depth by the SE to determine if it's genuinely useful. Well-written, useful pages without endless repetition of keywords are seen in a positive light by the SEs. (An example of a keyword might be a photography term on a page about taking photographs, like aperture or shutter speed.) Any attempt to fool the SE with lots of words which don't mean anything is a big negative and will harm a site's rank. Keywords that are relevant to the information on the page are one of the elements the SE uses to decide when to pull that page back into a result. The SEs also look at the text to see if it's been duplicated from another site, and if that's found to be the case it's another big negative. SEs like unique information.

Other data relevant to the site is also added into the mix, for instance page headings such as the HTML H tags, which should describe the content in a clear way. Inbound links from other sites that point to the website are a very important part of the algorithm, because the SEs consider them a positive signal from the wider web about the quality of the information on the page in question. However, these inbound links can't come from just anywhere; they have to be from recognised, high-quality sites. Long ago the SEs realised that some companies were using this feature to sell inbound links and fool the algorithms. Now poor-quality inbound links may well damage a site's reputation, so it's probably wise to stay away from the temptation of trying to fool the SEs.

Images on a page are great for the viewer, but they also contribute to the rank quality of the page. Using relevant file names that refer to the page's content, and placing meaningful text in the ALT attribute of the IMG HTML tag, helps too (see the example below). The ALT text is displayed by web browsers when they can't display the image itself. Perhaps the SEs see this as good design; again, who knows for sure?
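As a rough sketch of the headings and image markup discussed above (the heading text, file name and ALT text are purely illustrative), the body of the page might include something like this:

  <!-- H tags that describe the content of each section clearly -->
  <h1>Landscape Photography Tips</h1>
  <h2>Choosing the Right Shutter Speed</h2>

  <!-- A descriptive file name plus meaningful ALT text, shown if the image itself can't be displayed -->
  <img src="sunset-over-the-lake.jpg" alt="Sunset over a still lake, shot at a slow shutter speed">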

Even the website's web address, or URL (Uniform Resource Locator), can play a part. At one time keywords in the URL were important; whether they still are is really unknown.

And Here's The SEO Bit
As you can see, there is a lot to consider when designing a web page and placing information into it. The process of formatting and inserting that information into a page in a way the SEs can use to assist the rank is called SEO, or Search Engine Optimization. Whole industries now exist with the sole purpose of working with SEO to maximise their clients' rank. It is, though, something of a black art, where everyone tries to understand the algorithms and ranking systems of the SEs to ensure their pages offer the best SEO possible in the hope of climbing the rank ladder. Web designers will always consider the page layout to achieve the best SEO they can, but without going over the top. Trying to fool the SEs with dodgy optimisation techniques will harm a site's rank, and that is never good; it might not even be possible to claw it back. Putting together an honest and useful page is far more likely to achieve a better rank.

It's all well and good having great technical design with good SEO built in, but that really is only part of the story. Content is, as they say, king. Great content that visitors find useful will contribute a huge amount to a good rank. The SEs love those inbound links because they demonstrate that real people want to share the content, and that's a huge plus point. Effort must therefore be made to offer genuinely interesting information.

Mobile is Important
Until quite recently we only looked at websites on our desktop computers. Now, though, we use tablets, phones and other mobile devices, all of which offer a smaller viewing space for the information. The SEs, especially Google, realised this and decided that viewers using smaller devices should see search results featuring websites optimised for their mobiles and tablets. Any search that originates on a mobile will see mobile-ready sites ranked first. This trend for mobile use will only ever grow, so website owners must ensure their sites are mobile-ready and keep in line with the expectations of the SEs and users.

The approach to creating a website that fits any size of display is called Responsive Design. Using CSS properties (Wikipedia has more on CSS), the site detects the viewing area of the browser and adjusts to fit its dimensions, meaning the page will fit perfectly on a desktop display or a handheld mobile device. The CreativesGo site was designed with this approach, so why not try resizing your browser window and see how the page reacts and fits the new space? Any site that doesn't work well with mobile layouts will suffer in the rankings, but to help designers Google offers a free mobile-friendliness test: simply enter a page's URL and Google will analyse it for mobile suitability.
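As a rough sketch of how Responsive Design works in practice (the breakpoint width and class names are purely illustrative), a page might combine a viewport declaration with a CSS media query like this:

  <!-- Tell mobile browsers to use the device width rather than a zoomed-out desktop layout -->
  <meta name="viewport" content="width=device-width, initial-scale=1">

  <style>
    /* Two columns side by side on larger displays */
    .gallery { display: flex; }
    .gallery .column { width: 50%; }

    /* On narrow screens such as phones, stack the columns instead */
    @media (max-width: 600px) {
      .gallery { display: block; }
      .gallery .column { width: 100%; }
    }
  </style>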

Waiting For The Visitors
Even with a well-designed, interesting and useful web page, don't expect the visitors to come flooding in immediately. It can take months for the SEs to find a site, analyse it and rank it, and there's little chance it will appear at the top of the results early on; that takes time. This means the savvy website owner has to do their own promotion to get visitors in. Making use of social media to stir up interest and share links will get things moving, and if the Internet likes what it sees then people will share the content and add inbound links. The SEs see this activity over time and it contributes to an ever-better rank.

Patience is key to a successful site. Using this time to continually add quality content also helps with the rank, as the robots return to each site regularly and analyse how it is developing. Steady growth of quality information is a positive for the SEs. Sites with little content that receive no attention or updates don't do very well in the rankings.

Web Page Creation
Whilst optimisation is useful to understand, it might not be possible to apply all of the techniques to every website. The CreativesGo site was designed from the ground up with responsive design and SEO built in; it's a bespoke site, so we have a lot of control over content, look and feel. However, for owners of template-designed websites that operate on top of a third-party host like WordPress, applying SEO will require an understanding of how the content is used on the page. On our bespoke site we know exactly where we're putting our H tag or meta tag data, for instance, but when using a template it might not be clear whether the data is applied in the same way. Some research into the template's approach might be required.

Conclusion
Search engine optimisation is now a critical design consideration for any website owner. Success or failure will largely depend on how the SEs index a site, so aiming for the best possible SEO is a must. It's a complex topic that requires an understanding of many elements; no one thing makes a huge difference, and it's the sum of many well-constructed parts that leads to SEO success. There aren't, though, any guarantees with SEO. One day the SEs like this, the next they like that; they are fickle to say the least, which keeps any website designer firmly on their toes. We certainly haven't covered every component of good optimisation, because it's a very big topic with some quite complex concepts that it isn't possible to drill into here, and because we simply don't know all of the contributing factors; only the SEs know that. It's fair to say, though, that putting together a website using best practice and offering useful information for the visitor will always be seen favourably by the SEs and can only contribute towards better rankings.

We designed this article page with SEO in mind, so if you're confident in your coding skills why not use your browser's 'view source' option to look at this page's HTML code and see the optimisation in action? If you have any questions about this article, add them in the comments section below and we'll do our best to answer them for you.

All images © Peter Hatter
Article Date - November 2017
Note we cannot be held responsible for the contents of sites you access from these links.
Disclaimer: All the advice in this article is informational only, offered as is and comes with no guarantee or warranty whatsoever.




© 2017 Peter Hatter Photography Ltd. All rights reserved.