On-page and technical SEO are two of the most important aspects of SEO. Without them, your site can’t rank on search engines. In this article, you will learn how to do both.
What is On-page SEO
On-page SEO is the practice of optimizing the web pages on your site to improve its rankings and earn more traffic from search engines. “On-page” refers to both the content and the HTML source code (technical elements) of a page, both of which can be optimized.
Why do On-page SEO
Optimizing your pages can improve search rankings, increase traffic to your site, and increase conversions. Search engine optimization takes time to show results, but once your on-page SEO strategy is done properly, it can skyrocket your rankings and your online sales.
Recommended>>> Best Affordable Link Building Services
On-page SEO Checklist
In my article “11 Tips On How to Write a Good Blog Post“, I explained in detail how to write SEO-friendly articles for your blog. Below is a summary of on-page (content) SEO:
- Do keyword research
- Make sure your target keywords are in the title and meta title
- URL should be readable and should contain target keywords
- Include images and videos if necessary and make the images have the keywords in the alt text
- Build a good SEO silo on your site. An SEO silo is simply internal linking that points to the most important pages or posts on your site, such as your money pages.
- Do external linking if necessary.
- Use Synonyms
- Fix all broken links both external and internal.
- Post meta descriptions should contain target keywords.
- Also, the homepage title tag should contain your site’s main keyword. It is not compulsory, but it is highly recommended.
- Homepage meta description - This is one of the most overlooked aspects of SEO. On one of the sites I worked on (an exact-match domain), optimizing the homepage meta description attracted huge organic traffic. The meta description should be a summary of what your site is all about; make sure important keywords are included in that summary.
- Optimize your categories and tags. This is one of the most overlooked aspects of a blog. On most sites I have worked on, optimizing the tags and categories led to an increase in organic traffic. Be careful when you do this to avoid creating duplicate pages on your site.
Of course, it would be almost impossible to optimize all your pages at once (especially if you have hundreds or even thousands of pages), so start with the best ones that have the greatest potential to attract web traffic.
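For example, the title tag, meta description, and image alt-text points above might look like this in a page’s HTML (the keyword, file name, and wording are hypothetical placeholders):

```html
<head>
  <!-- Target keyword appears in the title tag and meta description -->
  <title>Best Running Shoes for Beginners (Complete Guide)</title>
  <meta name="description" content="Find the best running shoes for beginners, with budget picks and sizing tips.">
</head>
<body>
  <!-- Descriptive alt text that includes the target keyword naturally -->
  <img src="/images/best-running-shoes-beginners.jpg"
       alt="Best running shoes for beginners lined up on a track">
</body>
```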
What is Technical SEO
Technical SEO is the part of on-page SEO that deals with the elements of your site other than its content.
Technical SEO Checklist
This is one of the most important aspects of SEO. If your site has technical issues, it will be very difficult for search engine spiders to crawl and index your web pages, and if your site can’t be crawled or indexed, you can’t rank on search engines. Technical SEO consists of:
1. WEBSITE SPEED
This is one of Google’s ranking factors that people still tend to ignore. Keep in mind that site speed affects not only SEO but also UX (user experience), as users simply hate sites that load too slowly.
Many elements have a significant impact on page loading speed – so once you find something that slows down page loading time, try optimizing it or ask your web developer for help, as this often requires more advanced knowledge.
Key elements that affect the site’s speed:
- unoptimized images
- slow server response time (SRT)
- poor hosting service (see the web hosting services I recommend)
How Can I Check The Page Loading Speed?
Google has prepared a special tool called PageSpeed Insights, which will tell you exactly which elements of your site need to be optimized to make pages faster.
Follow the PageSpeed Insights diagnostic guidelines and you’ll effectively improve your page loading speed. Not every recommendation has to be implemented, as some of them are insignificant.
2. MOBILE-FRIENDLINESS
You need to make sure that your site is mobile-friendly. Millions of users around the world use mobile phones to access web pages, so if you don’t optimize your site for mobile devices, you will lose a lot of mobile search traffic.
To have a mobile-friendly site, you need to have a responsive design. If you use WordPress, there are a lot of themes both free and premium that are great on mobile.
How Do I Check My Site’s Mobile-Friendliness?
Google has introduced a mobile-friendly testing tool where you can quickly check whether your site is mobile-friendly; if it isn’t, the tool will indicate which elements should be improved. You can access it through your Google Search Console account (https://search.google.com/test/mobile-friendly).
In addition, you will find a coverage report in the Google Search Console. It will show you if your site has any problems with mobile crawling.
3. INDEXING ERRORS
If Google can’t crawl your pages, they won’t show up in the search results. This is why you should identify all indexing errors and other technical issues that prevent Google robots from accessing all pages.
How Do I Check For Indexing Errors?
There are some great ways to check if all your pages are crawled and indexed:
1. The site: Search Operator
This is by far the fastest and easiest way, and it doesn’t require any advanced knowledge or special tools. All you have to do is type the following command into Google:
site:yoursite.com as in site:ideapify.com
And Google will show you exactly how many pages are indexed. This is a great solution for sites that have no more than a few hundred pages – the results for huge sites may not be completely accurate.
2. Google Search Console
When discussing mobile-friendliness, the Google Search Console coverage report was mentioned, and that’s where you can find all the information about indexing issues.
You should check this report regularly and fix all indexing issues as soon as possible (for example, 404 errors and 500 errors). Otherwise, you may lose valuable web traffic.
3. Screaming Frog SEO Spider Tool
Screaming Frog SEO Spider Tool is a great tool that will crawl your entire website and identify all errors. You can also check the status code of your pages there – ideally, you should only see the HTTP 200 status response code, but if Screaming Frog shows 404 or 5xx status codes, then you have something to worry about.
You can use the free version to check up to 500 URLs.
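As a minimal sketch of the status-code check, the helper below groups crawl results (for example, URL/status pairs exported from a Screaming Frog crawl; the URLs here are hypothetical) by problem type:

```python
# Audit crawl results given as {url: http_status_code} pairs,
# e.g. exported from a site crawler like Screaming Frog.
def audit_status_codes(crawl_results):
    """Group crawled URLs by status-code category."""
    report = {"ok": [], "redirect": [], "broken": [], "server_error": []}
    for url, code in crawl_results.items():
        if code == 200:
            report["ok"].append(url)
        elif 300 <= code < 400:
            report["redirect"].append(url)        # e.g. 301, 302
        elif 400 <= code < 500:
            report["broken"].append(url)          # e.g. 404 Not Found
        else:
            report["server_error"].append(url)    # e.g. 500, 503
    return report
```

Anything in "broken" or "server_error" is what you want to fix first.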
4. REDIRECTS
When talking about indexing, you can’t ignore redirects either. In general, there can be four versions of your site:
- http://yoursite.com
- http://www.yoursite.com
- https://yoursite.com
- https://www.yoursite.com
The problem starts when Google can’t determine which version of your site is the main one. As a result, the search engine may index several versions, as it treats them as separate sites.
Fortunately, there is a solution to this problem – you can implement a 301 redirect, which will always bring users and Google searchers to the main version of your site.
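As one possible sketch, assuming an Apache server with mod_rewrite enabled and https://www.yoursite.com as the preferred version, the following .htaccess rules 301-redirect every other variant to it:

```apache
# Hypothetical .htaccess rules: force HTTPS and the www version
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.yoursite.com/$1 [L,R=301]
```

On Nginx or other servers the equivalent is a server-level redirect; ask your host or developer if you are unsure where to put these rules.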
5. SSL CERTIFICATE (HTTPS)
Google wants users to feel safe while browsing the web, so the search engine prefers sites that encrypt data. Currently, almost every online store and any other site where users enter confidential data must have an SSL certificate installed.
So if you still haven’t switched from HTTP to HTTPS, you should do so as soon as possible. Keep in mind that Google has officially confirmed HTTPS protocol as a ranking signal.
6. XML SITEMAP
A sitemap is an optional but highly recommended file that lists all the pages on your site. This file (XML sitemap) can boost your SEO, as it shows Google’s crawlers the correct architecture of your site, which makes indexing easier and ensures that Google doesn’t miss any of your pages. See how to generate and submit your sitemap to GSC.
Keep in mind that if you run your site in multiple language versions, you must create a separate sitemap file for each of them.
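A minimal XML sitemap looks like this (the URLs and dates below are hypothetical placeholders; plugins like Rank Math generate this file for you):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.yoursite.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.yoursite.com/blog/sample-post/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```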
7. ROBOTS.TXT
This is another file that you should never ignore. So what is the robots.txt file for?
If you don’t want Google’s search robots to access and index certain pages on your site, you can list those pages in robots.txt to block them from being crawled.
This is especially useful for online stores and other sites where users log in to access their dashboards. Generally speaking, the list of pages that you want to block from crawling depends on your site type. Using an SEO plugin like Rank Math can help with the process.
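As an illustration, a robots.txt for a typical WordPress store might look like this (the blocked paths are examples; adjust them to your own site):

```
User-agent: *
Disallow: /wp-admin/
Disallow: /cart/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.yoursite.com/sitemap.xml
```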
How Do I Check Robots.txt?
1. Web address
The first method is very similar to the one I recommended for the XML sitemap. You can simply check this file by entering the following URL: yoursite.com/robots.txt
You will see the file’s contents, including the list of pages that are blocked from crawling.
2. SEO Tools
Many SEO tools and Chrome extensions also let you quickly check the robots.txt file. SEO META in 1 CLICK is a great extension that shows the most important data on a page, including robots.txt.
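You can also check robots.txt rules with Python’s standard library. This is a minimal offline sketch; the rules below are a hypothetical example, and in practice you would fetch your own site’s robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents; replace with your site's actual rules.
rules = """\
User-agent: *
Disallow: /wp-admin/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

def is_crawlable(url, user_agent="*"):
    """Return True if the rules allow `user_agent` to fetch `url`."""
    return parser.can_fetch(user_agent, url)
```

This makes it easy to verify that important pages are not accidentally blocked before you deploy a new robots.txt.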
8. STRUCTURED DATA
Google just loves structured data – it helps crawlers correctly interpret the most important elements of your site. As a result, Google will display much more information in your snippet, which will make it more impressive and attractive.
The type of structured data that you implement depends primarily on what kind of site you have.
There are really many schema types that you can implement! Make sure the structured data on your site is entered correctly, and fix any errors. On WordPress, there are plenty of plugins for this, for example, the Rank Math plugin.
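As a minimal sketch, structured data for a blog post is usually emitted as schema.org JSON-LD. The helper below builds an Article object (all values are hypothetical placeholders); the resulting string goes inside a `<script type="application/ld+json">` tag in the page head:

```python
import json

def article_jsonld(headline, author, date_published, url):
    """Build schema.org Article structured data as a JSON-LD string."""
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,
        "mainEntityOfPage": url,
    }
    return json.dumps(data, indent=2)
```

In practice a plugin like Rank Math generates this markup for you; writing it by hand is only needed for custom setups.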
If you enjoyed this article, then please subscribe to our YouTube Channel for WordPress blogging video tutorials and how to make money online. If you have any questions, please leave us a comment. Also, If you think I missed a step or have any suggestions for us; please let us know. We'd love to hear from you!