When you spend months or years on a website, not to mention thousands of dollars, it’s hard to step back and look at it objectively. Can you look at it through the eyes of your users? Can you look at it the way Google does?
If you can look at your website the way Google does, you’ll probably discover areas in which your website needs work. So in that spirit, I’m going to teach you how you can see your website from Google’s perspective, and how you can then target the areas that need improvement.
First, Google finds your website
In order to see your website, Google needs to find it.
When you create a website, Google will discover it eventually. The Googlebot systematically crawls the web, discovering websites, gathering information about them, and indexing that information so it can be returned in search results.
You can help the Googlebot with this process, and you should. If you go through the steps below, your site will get indexed faster.
- First, create a sitemap – a sitemap is a special document created just for search engines. If your site lacks a sitemap, you need to create one. WordPress users can install the Google Sitemap Generator plugin for an easy but effective way to create one. Otherwise, you can use services like xml-sitemaps.com to generate one. You’ll need to upload the sitemap file to your root directory.
- Submit your website to Google Webmaster Tools – Google Webmaster Tools is the go-to resource for lots of valuable information. As a first step, you should sign your site up with Google Webmaster Tools to ensure that it’s being indexed and returned by Google. Once you’ve done this, it’s time to add your sitemap. In Google Webmaster Tools, click on your site. Then, navigate to “Crawl” and then “Sitemaps”. If there is no sitemap, click “Add/Test Sitemap” in the upper right corner. Add the sitemap you created in the step above.
- Go the extra mile – if you want to go the extra mile, you can ask Google to index your site on the Webmaster Tool URL submission page.
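Whichever route you take, the result is just an XML file listing your URLs. A minimal sketch, with placeholder URLs and dates, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <loc> is required, <lastmod> is optional -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-06-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about/</loc>
  </url>
</urlset>
```

The generators mentioned above will produce this file for you; you only need to upload it to your root directory.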
Now that Google sees your website, you need to find out what it’s going to see when it looks at it.
Google doesn’t look at anything blocked by robots.txt
Your robots.txt file tells Google, “You can look at these sections of my site, but not these other sections.”
If any files, pages, or directories on your site are disallowed, Google will comply with your request by not crawling them. As a result, those pages will generally not be returned in search engine results.
A robots.txt can be as simple or complicated as you want. I recommend the simple approach.
All you need is a text file on your web server, named robots.txt. When Google comes to crawl your site, it looks for this file first, before looking at your pages, to check what it has permission to crawl.
A robots.txt that says, “Yes, go ahead, crawl everything,” simply leaves the line after “Disallow:” empty.
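In its simplest form, a crawl-everything robots.txt is just two lines:

```
User-agent: *
Disallow:
```

The asterisk means the rule applies to all crawlers, and the empty Disallow line blocks nothing.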
If you list anything after “Disallow:”, then Google will not crawl that page or directory.
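For example, this robots.txt (the directory names are placeholders) keeps all crawlers out of two directories:

```
User-agent: *
Disallow: /private/
Disallow: /tmp/
```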
How can you tell if you have a robots.txt file?
Type in your URL, followed by /robots.txt. If you see a plain-text page of “User-agent” and “Disallow” lines, that’s your robots.txt file.
You can also find out if Google is fetching robots.txt information by checking the main dashboard of your Google Webmaster Tools account.
If you have no robots.txt file, then Google will simply crawl and index your entire site. That might be okay. But if there are sections of your site that you prefer not to be crawled or indexed, you should create a robots.txt file, and list them.
Using a robots.txt can be a great way to block off sections of your website that are creating duplicate content. For instance, some content management systems will create extra pages for comments or unnecessary variations of webpages. If that is happening to your website, then block them off in your robots.txt file.
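If you want to sanity-check your rules before Google sees them, Python’s standard library includes a parser that applies robots.txt the same way well-behaved crawlers do. A quick sketch, with a made-up /comments/ directory standing in for your duplicate content:

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that blocks a duplicate-content directory.
# The /comments/ path is a made-up example.
rules = [
    "User-agent: *",
    "Disallow: /comments/",
]

parser = RobotFileParser()
parser.parse(rules)

# Pages outside the blocked directory are crawlable...
print(parser.can_fetch("Googlebot", "http://www.example.com/blog/post-1/"))      # True
# ...pages inside it are not.
print(parser.can_fetch("Googlebot", "http://www.example.com/comments/page-2/"))  # False
```

If a URL you care about comes back False, your robots.txt is blocking more than you intended.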
Google looks at your title
After checking your robots.txt file, Google looks at a key nugget of content on your website — the page title. It’s an HTML tag, enclosed by <title> and </title> in your page’s <head>.
Here’s how Google sees your title tag:
- Google sees your entire title, but only about 65 characters matter. Those 65 characters are what a user will see when your page appears in the search engine results pages (SERPs). For Quick Sprout, Google sees “Quick Sprout – I’m Kind of a Big Deal.”
- Google sees all the titles on all your pages, and wants each one to be unique. Don’t give every page on your site the same title.
- Google sees the keywords in the title, but doesn’t want to see keyword stuffing. In other words, your title tag should contain more than just keywords.
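In your page’s source, that means a unique, keyword-aware (but not keyword-stuffed) line in the <head>. Using the Quick Sprout title mentioned above:

```html
<head>
  <title>Quick Sprout – I’m Kind of a Big Deal</title>
</head>
```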
Google looks at your description
After looking at your title, Google moves on to your page description.
The page description is another meta tag, this time a <meta name="description"> tag in your page’s <head>.
Although Google indexes your meta description, it doesn’t use it as a ranking factor. In other words, you’re not going to get higher rankings with some sizzling genius of a keyword-focused description. The description is there for your users; Google displays it as the snippet under your title in the SERP.
You should include a meta description on each page, recognizing that Google looks at it, but doesn’t depend on it as a major ranking factor.
Make sure you write a brief meta description for each page. Keep it to roughly 160 characters, and, most important, write it for humans, not just search engines.
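In the HTML, the description sits in the <head> alongside the title. The copy below is made-up example text:

```html
<head>
  <meta name="description" content="See your website the way Google sees it, and learn which areas to improve first.">
</head>
```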
Remember! This description is probably the only free advertising copy you get on the entire planet. So don’t gloss over it. Spend time crafting compelling descriptions that will increase your click-through rate.
Google can’t see what your images look like, but it looks at alt tags to see what each image is about
Google is pretty good at looking at just about everything on your site. They’re even good at searching by image.
But for your site, the alt tags on images are what truly matter. That’s what Google indexes and uses as part of its search results. Even though Google can’t necessarily “see” the image, it sees what you provide in the alt tag.
An alt tag is part of the HTML code for any image on your site.
<img src="http://www.example.com/picture.png" alt="Keyword Phrase">
The “Keyword Phrase” in the line above is where you should provide a brief description of your image. Every single image needs this, and the description needs to relate to the actual image.
In some CMSs, alt tags are generated automatically. Sometimes, they are applied indiscriminately by someone who is not aware of proper alt tagging practices. The tags end up echoing the image’s filename or some other meaningless string.
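For instance, auto-generated tags often look like this (the filenames are made up):

```html
<img src="http://www.example.com/IMG_2036.png" alt="IMG_2036">
<img src="http://www.example.com/picture2.png" alt="picture2">
```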
These are useless.
Instead, you should create alt tags for images that are descriptive, keyword smart, and useful. To make sure you have alt tags, simply look at your HTML code to see if there is information after alt= for each image.
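If you have more than a handful of pages, eyeballing the source gets tedious. Here’s a rough sketch using Python’s built-in HTML parser to flag images whose alt text is missing or empty; the sample HTML is made up for illustration:

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collects the src of every <img> whose alt text is missing or empty."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # alt is absent or an empty string
                self.missing_alt.append(attrs.get("src", "(no src)"))

# Made-up sample page for illustration.
html = """
<img src="good.png" alt="Blue widget, model 123">
<img src="bad.png">
<img src="empty.png" alt="">
"""

checker = AltChecker()
checker.feed(html)
print(checker.missing_alt)  # ['bad.png', 'empty.png']
```

Run it against your saved page source, and every src it prints is an image that needs an alt tag written.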
You may want to check your robots.txt to make sure that you’re not disallowing images from being crawled on the site. Once you have images all nicely alt tagged, go ahead and let the crawlers loose on this valuable source of rank potential.
The one thing you should not have in your robots.txt is a rule that disallows your image files or image directory.
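Something like this (assuming your images live in an /images/ directory) would keep them all out of image search:

```
User-agent: *
Disallow: /images/
```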
What you need to do is add descriptive and keyword-sensitive alt tags to every image on your site. If you run an ecommerce website, be sure to include any model or serial numbers in your alt tags.
Google looks at content
Now we’ve come to the most important aspect of what Google sees… your content.
By “content,” I mean everything that is unfettered by HTML code and displayed for users in all its glorious brilliance — undaunted by length requirements, and free to convey powerful, compelling, and useful information.
Content! I love it.
Every single word on your page is seen and indexed by Google. And the more content you have, the better. Google is not limited in how many pages of your website it will crawl, index, and return. It reaches everything.
I recommend that you have plenty of content on every page. You can’t go wrong with lots of content. But please, make sure that it’s quality content.
Not only does Google look at all your content, but they turn their Googlebot heads whenever you publish new content. That is why I insist on producing epic-length content every week, and why I suggest that you do the same.
Google’s fresh factor suggests that regular, new, and awesome content will get the Google look.
Taking a fresh look at your website from Google’s perspective helps you to break out of the rut of obsessing over the same old things on your site.
It’s likely that Google cares about some factors that are different from what you keep looking at. What I’ve listed above are the most important of those factors.
Why don’t you take a few minutes with your website, going through each of these factors? See if you can catch a glimpse of what Google is seeing. You might really improve your rankings.
What helps you look at your website the way Google does?
About the Author: Neil Patel is the cofounder of Neil Patel Digital.