“Best SEO Practices for Developers: Put Your Skills to Work”
This article is for developers who want to build SEO requirements into a website from the start and avoid redoing work after development ends.
Traditionally, the development team discusses and creates a project’s architecture and business logic, while website promotion is postponed until the last stage and handed to the SEO team. This approach to building a project can have costly consequences. At the production stage, or even after deployment, there’s often a need to make edits to the project: changes to meta tags and link attributes, as well as improvements to the link structure.
We’ve prepared our own checklist with the main things you need to consider at the development stage.
Title tags and meta descriptions are bits of HTML code in the <head> of a web page. They help search engines understand the contents of the page.
<head>
  <title>What developers have to know about SEO</title>
  <meta name="description" content="An article for development teams that want to create a well-optimized website.">
</head>
You may think, What else is there to add?
To begin with, you see snippets in the search engine results. We talked about this in our SEO web development tutorial for beginners. Snippets usually contain a title and description.
Usually, but not always. At times, when a search robot identifies a title as irrelevant to the text on the page, the snippet can be automatically generated based on the page content.
At first glance, this is a problem for copywriters and SEO experts rather than for developers. However, there are cases when developers should not only add the <title> tag to the page but should also generate a title and description automatically on the backend depending on the logic of the project.
It’s best for developers to generate unique titles and descriptions. But this doesn’t guarantee that they’ll appear in the snippet. For example, if the titles differ by one word only, there’s a chance that the search robot will mark them as irrelevant.
Examples:
- The best cake shops in New York
- The best cake shops in Stamford
- The best cake shops in Manchester
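When pages like these are generated from data, the title and description can be composed on the backend. Here’s a minimal, framework-free Python sketch; the site name and the 60/160-character limits are illustrative assumptions, not fixed rules:

```python
def build_meta(topic: str, city: str, site_name: str = "ExampleSite") -> dict:
    """Compose a unique <title> and meta description for a listing page.

    `site_name` and the 60/160-character limits are illustrative assumptions.
    """
    title = f"The best {topic} in {city} | {site_name}"
    description = (
        f"A hand-picked guide to the best {topic} in {city}: "
        f"addresses, opening hours, and reviews."
    )
    # Truncate to common snippet display limits.
    return {
        "title": title[:60],
        "description": description[:160],
    }

meta = build_meta("cake shops", "New York")
print(meta["title"])  # → The best cake shops in New York | ExampleSite
```

Because the city and topic are interpolated into both fields, each page gets a title and description that differ by more than one word.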
One tool that can help you manage titles and descriptions in Django is django-meta.
Meta tags are hidden HTML tags placed in the <head> of your web pages. They provide search engines with information about your website or web page, such as its title, description, keywords, robots rules, copyright, and language. Search engines usually index meta tags and use them in search results. Without meta tags, your site is far less likely to reach readers organically when they type something into a search engine.
Microdata is a shared markup vocabulary that’s understood and interpreted by Google, Yandex, and Yahoo search robots. The search engines came up with this vocabulary back in 2011. With microdata, you can show search robots that specific text or other elements on a page are important and belong to a certain type of data (on their own, search robots can’t interpret the meaning of content and set priorities).
For example, the About page of your site or the Contacts page should use microdata to mark the block with contact details, in which case a search robot will show “contact details of X company” in response to a user’s query.
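Structured data is often emitted as JSON-LD generated on the backend. Here’s a sketch of rendering a schema.org Organization block for a Contacts page; the company details are made up:

```python
import json

def organization_jsonld(name: str, telephone: str, url: str) -> str:
    """Render a schema.org Organization block as a JSON-LD <script> tag."""
    data = {
        "@context": "https://schema.org",
        "@type": "Organization",
        "name": name,
        "telephone": telephone,
        "url": url,
    }
    return (
        '<script type="application/ld+json">'
        + json.dumps(data)
        + "</script>"
    )

# Hypothetical company details for illustration.
snippet = organization_jsonld("Example Corp", "+1-555-0100", "https://example.com")
print(snippet)
```

The resulting tag goes into the page’s <head> or <body>, where search robots pick it up alongside the visible contact details.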
Google has released a Structured Data Testing Tool that helps you check how well your website data is structured.
Robots.txt is a text file that you put on your site to tell search robots which pages they can index and which ones you would like them not to index.
You don’t have to use a robots.txt file. If you don’t, all pages of your website will be available for search robots.
You should bear in mind, though, that if you block certain pages of your website in robots.txt, they can still be indexed if search robots find links to these pages on other sites or on pages of your website that remain open to crawlers. To truly block a URL from being indexed, use the "noindex" directive.
<meta name="robots" content="noindex">
The robots.txt file should always be at the root of your domain. Robots.txt also allows you to tell search engines where to find your sitemap.
In Django, you can use the django-robots app to manage the robots.txt file.
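If you’d rather generate the file than hand-maintain it, the logic is small. A sketch; the disallowed paths and sitemap URL are examples only:

```python
def render_robots_txt(disallow: list, sitemap_url: str) -> str:
    """Build a robots.txt body: disallowed paths plus a Sitemap line."""
    lines = ["User-agent: *"]
    lines += [f"Disallow: {path}" for path in disallow]
    lines.append(f"Sitemap: {sitemap_url}")
    return "\n".join(lines) + "\n"

# Example paths; adjust to the pages you actually want to keep crawlers out of.
body = render_robots_txt(["/admin/", "/cart/"], "https://example.com/sitemap.xml")
print(body)
```

The output can be served from a view at the domain root or written to a static file at deploy time.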
A sitemap is an XML file on your website that tells search engine indexers how frequently your pages change and how important certain pages are in relation to other pages on your site. This information helps search engines index your site.
Search engines read the sitemap to crawl your site more intelligently. In the sitemap, you can provide information about files you think are important, when a page was last updated, how often a page is updated, and so on.
A sitemap isn’t compulsory. However, it becomes important when your site has a large archive of content pages that are isolated or not well linked to each other.
If your website contains a search form or filters, you can often get to the results pages only through this form. There are generally no links that direct you to these pages. If you consider these pages important and want search engines to index them, add links to page results in sitemap.xml.
Django comes with a high-level sitemap generating framework.
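For illustration, here’s roughly the XML that a sitemap framework produces, built with the standard library; the URL, date, and priority values are made up:

```python
from xml.etree import ElementTree as ET

def render_sitemap(entries: list) -> str:
    """Render sitemap.xml: each entry dict has loc, lastmod, changefreq, priority."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for entry in entries:
        url = ET.SubElement(urlset, "url")
        for tag in ("loc", "lastmod", "changefreq", "priority"):
            ET.SubElement(url, tag).text = str(entry[tag])
    return ET.tostring(urlset, encoding="unicode")

# A filter-results page reachable only through a search form, made indexable
# by listing it in the sitemap (values are illustrative).
xml = render_sitemap([{
    "loc": "https://example.com/search/?q=cakes",
    "lastmod": "2020-01-15",
    "changefreq": "weekly",
    "priority": 0.5,
}])
print(xml)
```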
There are also directives such as nofollow and noindex. These are part of the Robots Exclusion Protocol (REP), which controls how web pages are crawled and indexed.
Nofollow tells a search robot not to follow a link on your site. It shows search engines that they shouldn’t follow the links on a page and that there’s no need to crawl the corresponding URLs. This means that search engines pass neither PageRank nor anchor text value through these links.
For example:
<meta name="robots" content="nofollow" />
or
<a href="https://www.example.com" rel="nofollow">Link text</a>
You can set up your page in such a way that all outgoing links are nofollow and don’t carry the weight of the page. But when you need a dofollow link, you can input it manually through your CMS. Some CMSs are capable of doing this automatically.
Noindex closes pages from being indexed. This meta tag can be added to the initial code of the HTML webpage and tells search engines not to index a specific page in the search results.
Note that noindex works at the page level; there is no valid rel="noindex" value for individual links. To keep a page out of the index, add the meta tag to its <head> or send an X-Robots-Tag HTTP response header:
<meta name="robots" content="noindex">
X-Robots-Tag: noindex
A canonical link is an HTML element that helps webmasters prevent duplicate content issues.
There are often situations when the same content is available on a website through several links. At the development stage, you may tackle this if you already know that some pages can relate to different categories. For example, on an ecommerce site products might be featured in several categories. The presence of duplicate content on your website has a bad influence on the ranking, since search engines don’t know which version to rank for search results.
The solution to this problem is to mark all duplicate pages with the rel="canonical" link element.
<link rel="canonical" href="http://example.com/origin-content/" />
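Before emitting the tag, the backend often needs to decide which URL is the canonical one, for example by stripping tracking query parameters so that all duplicate addresses map to a single URL. A sketch; the parameter list is an assumption:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical set of tracking parameters that create duplicate URLs.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "ref"}

def canonical_url(url: str) -> str:
    """Strip tracking query params so duplicate URLs collapse to one canonical URL."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(query), ""))

print(canonical_url("http://example.com/origin-content/?utm_source=mail"))
# → http://example.com/origin-content/
```

The cleaned URL is then what you put into the rel="canonical" link element on every duplicate page.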
Alt text, also known as an alt attribute, alt description, or (informally) an alt tag, is used in HTML code to describe the interface elements and images on the page. You need to give every image an SEO-friendly alt attribute.
You can write alt text for images automatically or manually.
Alt text is used:
- To make the internet more accessible, for instance to the visually impaired;
- If the file image can’t be loaded;
- To ensure a better description of images for search robots, helping them index images correctly.
Use both specific subjects and context when writing alt text.
- Let your alt text be no more than 125 characters.
- Don’t start alt text with “picture of…” or “image of…”
- Use a few important keywords, but sparingly.
- Don’t add your keyword to every single alt text.
- If an image doesn’t have any value, it should live within the CSS, not HTML.
- Use the longdesc="" attribute for more complex images that require longer descriptions.
Let’s demonstrate a good and bad example of an alt text.
Bad: alt="Woman pointing to a person's computer screen."
Good: alt="Business school professor pointing to a female student’s computer screen."
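The guidelines above are easy to check automatically, for instance when images are uploaded. A minimal sketch using the limits from the list above:

```python
def alt_text_issues(alt: str) -> list:
    """Check alt text against the guidelines above; return a list of problems."""
    issues = []
    if len(alt) > 125:
        issues.append("longer than 125 characters")
    lowered = alt.lower()
    if lowered.startswith(("picture of", "image of")):
        issues.append("starts with 'picture of' / 'image of'")
    if not alt.strip():
        issues.append("empty alt text")
    return issues

print(alt_text_issues("Image of a cat"))
print(alt_text_issues("Business school professor pointing to a female student's computer screen"))
```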
In this section, we’ll look at H1, H2, and H3 heading tags (H stands for heading) and how they influence the optimization of your website. If these tags are placed incorrectly on your page, it will be difficult for search engines to extract information about your website. Headings contribute to the ranking of your website and give structure to the HTML on your page.
Each page of your website should contain only one H1 heading. It should be different from all other headings on your website pages. H2 and H3 headings aren’t compulsory, but if you do use them, they shouldn’t duplicate content. You can use a couple of H2 and H3 headings on one page, though.
Keep in mind that:
- Headings should include a keyword to rank your website higher in the search results.
- Each heading should be unique. If they repeat, a search engine may not rank them.
- The closer a heading is to the top of the HTML code, the more weight it carries in relation to all other elements.
- The keyword should be as close as possible to the beginning of the heading.
- The length of a heading should be no more than 60 characters.
- There should be no grammatical errors in the heading.
- Headings are checked for over-optimization, so they shouldn’t be stuffed with keywords.
Successful optimization of your website depends on how tags are written. Therefore, you should keep the following rules in mind when you write tags:
- Keep to the hierarchy of tags (H1 should be higher than all other headings).
- Keep to the hierarchy of fonts (the higher the heading, the bigger the font).
- Stick to text elements in headings and don’t link to other sources from a heading.
- Don’t overuse H1, H2, and H3 headings. Search robots may think your page is spam if you use a large number of headings to highlight important parts of your page.
- You can use a picture as your heading. For example, you can use a picture with your logo that’s optimized accordingly.
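One way to catch hierarchy mistakes, such as a page with zero or multiple H1s, is to count headings while parsing the rendered page. A standard-library sketch:

```python
from html.parser import HTMLParser

class HeadingCounter(HTMLParser):
    """Count h1-h3 tags so pages with zero or multiple H1s can be flagged."""

    def __init__(self):
        super().__init__()
        self.counts = {"h1": 0, "h2": 0, "h3": 0}

    def handle_starttag(self, tag, attrs):
        if tag in self.counts:
            self.counts[tag] += 1

parser = HeadingCounter()
parser.feed("<h1>Main</h1><h2>First</h2><h2>Second</h2><h3>Detail</h3>")
print(parser.counts)  # exactly one H1 is what we want
```

A check like this can run in tests or CI against the rendered templates.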
A redirect is a way to send both users and search engines to a different URL from the one they originally requested.
Redirects don’t have a bad influence on SEO. However, a poor implementation might cause all kinds of trouble from loss of page ranking to loss of traffic.
Redirects are needed when you delete a post or change your URL structure.
In Django, you can issue redirects automatically in code or let site editors manage them manually through the admin.
Django comes with an optional redirects application.
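The redirects app stores old-path-to-new-path pairs; the core idea can be sketched without the framework (the paths below are made up):

```python
# Old path -> new path, the kind of mapping Django's redirects app keeps in the database.
REDIRECTS = {
    "/old-blog/seo-tips/": "/blog/seo-tips/",
    "/deleted-post/": "/blog/",
}

def resolve(path: str):
    """Return (status, location): a 301 to the new URL, or 200 for the path itself."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(resolve("/old-blog/seo-tips/"))  # → (301, '/blog/seo-tips/')
```

A permanent (301) redirect tells search engines to transfer the old URL’s ranking to the new one, which is why it’s preferred over a temporary (302) redirect for moved content.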
Page speed is the time it takes to load the contents of your website. You can check how fast your website loads on Google’s PageSpeed Insights. You can also try to optimize the loading time.
Page speed is a ranking signal. A slow page also means that search engines can scan fewer pages within the given crawl budget, which may negatively influence your indexation.
The page speed is also important for users. As a rule, pages with a lengthy loading time are likely to be abandoned faster, decreasing the average time on a web page. High page loading times have a negative influence on conversions.
What can you do to increase the page speed?
- Turn on file compression (preferably while keeping the quality)
- Reduce redirects
- Leverage browser caching
- Improve server response times
- Use a content delivery network (CDN)
- Optimize images
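To see why the first item matters, compare payload sizes before and after compression with a quick standard-library check (the page content is a stand-in):

```python
import gzip

# Repetitive markup, standing in for a typical HTML page.
html = b"<html><body>" + b"<p>Lots of repetitive page content.</p>" * 200 + b"</body></html>"
compressed = gzip.compress(html)

print(f"original:   {len(html)} bytes")
print(f"compressed: {len(compressed)} bytes")
```

HTML, CSS, and JavaScript compress very well because they are highly repetitive, which is why enabling gzip (or Brotli) on the server is usually the cheapest page-speed win.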
A URL is a human-readable text that replaces the numbers (IP addresses) that computers use to communicate with servers and identify the file structure on a website. A URL consists of a protocol, domain name, and path and has the following basic format: protocol://domain-name.top-level-domain/path.
Things to know about URLs:
- A URL shows the structure of the site.
- Only lowercase letters are used in human-readable URLs.
- A human-readable URL needs to be shorter than 90 characters.
You need to create SEO-friendly URLs (website links). Keep your URLs short and use the search term you want to rank for in your path. Each page should have its own unique, relevant URL.
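Django ships a helper for building such URLs (django.utils.text.slugify); here is a simplified re-implementation for ASCII text (Django’s version also handles Unicode and underscores):

```python
import re

def slugify(text: str) -> str:
    """Lowercase, drop punctuation, and join words with hyphens."""
    text = text.lower().strip()
    text = re.sub(r"[^a-z0-9\s-]", "", text)   # drop everything but letters, digits, spaces, hyphens
    return re.sub(r"[\s-]+", "-", text).strip("-")

print(slugify("The Best Cake Shops in New York!"))  # → the-best-cake-shops-in-new-york
```

The result is a short, lowercase, human-readable path segment containing the search term you want to rank for.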
You can find more information about URLs and an example in our SEO web development tutorial.
Perhaps you’ve added or changed a title and meta description according to all recommendations from SEO experts, but the snippet hasn’t changed. Or you’ve added a new page to the sitemap but it doesn’t show up in search results.
This happens when you’ve deployed your website to production and a search engine has already indexed the old content. After that, you took up optimization, added meta tags, and refreshed the production site. Having done this, you need to wait until the search engine indexes your website again. Google indexing can be initiated from Google Search Console. Django developers can also ping Google using a management command:
python manage.py ping_google [/sitemap.xml]
After this, the search results will be refreshed in 10 to 14 days.
We hope that our recommendations will be useful, especially when you’re just starting your project, and will help you save tons of time on changing the logic when it’s time to promote your website.
If you want to build a website that gets high rankings from search engines like Google, get in touch with our sales team!